Apr 28 19:13:30.064789 ip-10-0-133-121 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 28 19:13:30.064799 ip-10-0-133-121 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 28 19:13:30.064806 ip-10-0-133-121 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 28 19:13:30.065064 ip-10-0-133-121 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 28 19:13:40.248353 ip-10-0-133-121 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 28 19:13:40.248369 ip-10-0-133-121 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot e190c19dd9c94e17aca0858f8b65b271 --
Apr 28 19:15:42.968313 ip-10-0-133-121 systemd[1]: Starting Kubernetes Kubelet...
Apr 28 19:15:43.329569 ip-10-0-133-121 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:15:43.329569 ip-10-0-133-121 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 28 19:15:43.329569 ip-10-0-133-121 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:15:43.329569 ip-10-0-133-121 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 28 19:15:43.329569 ip-10-0-133-121 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:15:43.331009 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.330906 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 28 19:15:43.333747 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333731 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:43.333747 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333747 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333750 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333754 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333757 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333760 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333762 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333765 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333767 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333773 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333776 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333779 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333782 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333785 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333788 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333790 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333793 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333796 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333798 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333801 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333804 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:43.333809 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333807 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333809 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333813 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333815 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333820 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333823 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333826 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333828 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333831 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333833 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333836 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333838 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333842 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333844 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333847 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333850 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333852 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333855 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333858 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333860 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:43.334300 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333863 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333866 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333869 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333872 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333874 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333877 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333879 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333883 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333887 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333890 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333892 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333895 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333899 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333903 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333908 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333911 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333914 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333917 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333919 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:43.334849 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333922 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333924 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333927 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333929 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333932 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333934 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333937 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333939 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333942 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333945 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333947 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333950 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333960 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333963 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333965 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333968 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333971 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333987 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333990 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:43.335342 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333993 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333996 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.333998 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334001 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334004 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334006 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334009 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334370 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334375 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334378 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334381 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334383 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334386 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334389 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334392 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334394 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334397 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334400 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334402 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334405 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:43.335800 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334408 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334410 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334413 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334415 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334418 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334421 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334424 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334427 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334430 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334432 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334435 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334438 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334441 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334443 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334446 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334449 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334452 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334454 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334457 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334461 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:43.336285 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334465 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334469 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334473 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334475 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334478 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334481 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334484 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334487 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334489 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334492 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334494 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334497 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334500 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334503 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334506 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334508 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334511 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334514 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334517 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:43.337157 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334519 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334522 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334525 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334527 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334530 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334533 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334535 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334538 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334541 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334543 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334546 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334549 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334551 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334554 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334557 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334559 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334562 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334564 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334567 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334570 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:43.337735 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334573 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334576 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334578 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334581 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334584 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334587 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334589 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334592 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334594 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334597 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334599 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334602 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334604 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.334607 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.335944 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.335954 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.335960 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.335965 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.335969 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.335984 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.335989 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 28 19:15:43.338265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.335994 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.335998 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336001 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336005 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336008 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336011 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336014 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336017 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336020 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336023 2565 flags.go:64] FLAG: --cloud-config=""
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336026 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336029 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336034 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336037 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336040 2565 flags.go:64] FLAG: --config-dir=""
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336043 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336047 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336051 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336054 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336057 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336060 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336063 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336067 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336070 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336073 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 28 19:15:43.338783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336076 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336081 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336084 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336088 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336091 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336095 2565 flags.go:64] FLAG: --enable-server="true"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336098 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336102 2565 flags.go:64] FLAG: --event-burst="100"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336105 2565 flags.go:64] FLAG: --event-qps="50"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336108 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336111 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336114 2565 flags.go:64] FLAG: --eviction-hard=""
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336118 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336121 2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336124 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336128 2565 flags.go:64] FLAG: --eviction-soft=""
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336131 2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336134 2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336137 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336140 2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336144 2565 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336147 2565 flags.go:64] FLAG: --fail-swap-on="true"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336150 2565 flags.go:64] FLAG: --feature-gates=""
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336154 2565 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336157 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 28 19:15:43.339405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336160 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336163 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336166 2565 flags.go:64] FLAG: --healthz-port="10248"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336169 2565 flags.go:64] FLAG: --help="false"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336172 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-133-121.ec2.internal"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336175 2565 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336178 2565 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336182 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336185 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336189 2565 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336192 2565 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336195 2565 flags.go:64] FLAG: --image-service-endpoint=""
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336201 2565 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336204 2565 flags.go:64] FLAG: --kube-api-burst="100"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336206 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336210 2565 flags.go:64] FLAG: --kube-api-qps="50"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336213 2565 flags.go:64] FLAG: --kube-reserved=""
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336216 2565 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336219 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336222 2565 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336225 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336228 2565 flags.go:64] FLAG: --lock-file=""
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336231 2565 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336234 2565 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 28 19:15:43.340020 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336237 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336242 2565 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336246 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336249 2565 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336252 2565 flags.go:64] FLAG: --logging-format="text"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336255 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336258 2565 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336261 2565 flags.go:64] FLAG: --manifest-url=""
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336264 2565 flags.go:64] FLAG: --manifest-url-header=""
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336269 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336273 2565 flags.go:64] FLAG: --max-open-files="1000000"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336276 2565 flags.go:64] FLAG: --max-pods="110"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336279 2565 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336283 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336286 2565 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336289 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336292 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336295 2565 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336299 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336349 2565 flags.go:64] FLAG: --node-status-max-images="50"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336387 2565 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336391 2565 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336395 2565 flags.go:64] FLAG: --pod-cidr=""
Apr 28 19:15:43.340603 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336398 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336406 2565 flags.go:64] FLAG: --pod-manifest-path=""
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336908 2565 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336931 2565 flags.go:64] FLAG: --pods-per-core="0"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336939 2565 flags.go:64] FLAG: --port="10250"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336945 2565 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336952 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e81bd3f98dc7bac7"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336959 2565 flags.go:64] FLAG: --qos-reserved=""
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336966 2565 flags.go:64] FLAG: --read-only-port="10255"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336972 2565 flags.go:64] FLAG: --register-node="true"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.336993 2565 flags.go:64] FLAG: --register-schedulable="true"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337005 2565 flags.go:64] FLAG: --register-with-taints=""
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337020 2565 flags.go:64] FLAG: --registry-burst="10"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337025 2565 flags.go:64] FLAG: --registry-qps="5"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337030 2565 flags.go:64] FLAG: --reserved-cpus=""
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337035 2565 flags.go:64] FLAG: --reserved-memory=""
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337041 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337046 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337052 2565 flags.go:64] FLAG: --rotate-certificates="false"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337062 2565 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337067 2565 flags.go:64] FLAG: --runonce="false"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337072 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337078 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337083 2565 flags.go:64] FLAG: --seccomp-default="false"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337088 2565 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337093 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 28 19:15:43.341200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337098 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337108 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337113 2565 flags.go:64] FLAG: --storage-driver-password="root"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337118 2565 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337123 2565 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337128 2565 flags.go:64] FLAG: --storage-driver-user="root"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337133 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337138 2565 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337143 2565 flags.go:64] FLAG: --system-cgroups=""
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337150 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337171 2565 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337176 2565 flags.go:64] FLAG: --tls-cert-file=""
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337181 2565 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337190 2565 flags.go:64] FLAG: --tls-min-version=""
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337195 2565 flags.go:64] FLAG: --tls-private-key-file=""
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337200 2565 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337204 2565 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337215 2565 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337220 2565 flags.go:64] FLAG: --v="2"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337226 2565 flags.go:64] FLAG: --version="false"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337233 2565 flags.go:64] FLAG: --vmodule=""
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337239 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337246 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337686 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:43.341830 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337700 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337704 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337707 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337711 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337714 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337717 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337720 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337722 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337725 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337728 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337730 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337733 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337736 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337739 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337742 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337745 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337747 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337750 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337754 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337758 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:43.342413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337761 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337764 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337767 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337770 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337773 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337775 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337778 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337781 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337784 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337786 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337789 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337792 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337795 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337798 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337801 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337804 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337806 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337809 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337811 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337814 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:43.342936 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337817 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337819 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337822 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337825 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337827 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337830 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337832 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337835 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337838 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337841 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337843 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337846 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337849 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337851 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337854 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337856 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337859 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337862 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337864 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337867 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:43.343456 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337869 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337872 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337876 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337879 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337883 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337886 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337889 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337892 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337895 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337898 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337901 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337903 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337906 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337909 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337911 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337914 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337916 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337919 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337922 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:43.343948 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337924 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337927 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337930 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337932 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337935 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.337937 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.337942 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.344203 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.344217 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344277 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344281 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344285 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344288 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344292 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344295 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:43.344435 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344298 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344302 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344305 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344308 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344311 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344314 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344316 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344319 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344322 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344325 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr
28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344327 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344330 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344333 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344335 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344338 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344340 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344343 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344346 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344349 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344352 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:15:43.344850 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344354 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344357 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344360 2565 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344362 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344365 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344369 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344372 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344375 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344378 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344380 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344383 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344385 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344388 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344391 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344393 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:15:43.345358 ip-10-0-133-121 
kubenswrapper[2565]: W0428 19:15:43.344396 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344399 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344401 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344404 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344407 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:15:43.345358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344410 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344413 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344415 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344418 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344421 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344423 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344426 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344428 2565 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344431 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344434 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344436 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344439 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344442 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344444 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344447 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344450 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344452 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344456 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344459 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:15:43.345848 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344462 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:15:43.345848 ip-10-0-133-121 
kubenswrapper[2565]: W0428 19:15:43.344465 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344467 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344470 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344472 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344475 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344478 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344482 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344485 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344489 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344492 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344495 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344497 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344500 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344503 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344505 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344508 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344510 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344513 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344516 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:43.346358 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344518 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.344524 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344619 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344624 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344627 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344630 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344633 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344636 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344639 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344642 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344645 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344649 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344653 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344655 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344658 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:43.346858 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344660 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344663 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344666 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344669 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344671 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344674 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344677 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344679 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344683 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344685 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344688 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344691 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344694 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344696 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344699 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344702 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344705 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344707 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344710 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:43.347351 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344713 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344715 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344718 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344720 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344723 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344726 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344728 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344731 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344735 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344739 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344742 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344747 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344751 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344754 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344757 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344760 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344762 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344765 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344768 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:43.347814 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344770 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344773 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344775 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344778 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344781 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344783 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344786 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344789 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344791 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344794 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344797 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344800 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344802 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344805 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344807 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344810 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344812 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344815 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344817 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344820 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:43.348302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344822 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344825 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344828 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344831 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344833 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344837 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344839 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344842 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344844 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344847 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344849 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344853 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344855 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344858 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:43.344861 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.344866 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:15:43.348821 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.344961 2565 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 28 19:15:43.349275 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.348191 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 28 19:15:43.349275 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.349113 2565 server.go:1019] "Starting client certificate rotation"
Apr 28 19:15:43.349275 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.349211 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:15:43.349275 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.349247 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:15:43.371807 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.371786 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:15:43.374212 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.374192 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:15:43.385615 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.385485 2565 log.go:25] "Validated CRI v1 runtime API"
Apr 28 19:15:43.390888 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.390875 2565 log.go:25] "Validated CRI v1 image API"
Apr 28 19:15:43.392122 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.392102 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 28 19:15:43.395652 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.395625 2565 fs.go:135] Filesystem UUIDs: map[2242ba5b-3396-4428-a185-42b4aec2e746:/dev/nvme0n1p4 6c9d5bad-61a6-4ec5-8e00-9f1f1357cd67:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 28 19:15:43.395724 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.395652 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 28 19:15:43.402297 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.402187 2565 manager.go:217] Machine: {Timestamp:2026-04-28 19:15:43.400532561 +0000 UTC m=+0.334202348 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099899 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2371a6fb4c8d1c6b061d06f6b2f860 SystemUUID:ec2371a6-fb4c-8d1c-6b06-1d06f6b2f860 BootID:e190c19d-d9c9-4e17-aca0-858f8b65b271 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fb:75:39:22:55 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fb:75:39:22:55 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0e:05:b7:2c:e6:f9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 28 19:15:43.402873 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.402863 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 28 19:15:43.402958 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.402947 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 28 19:15:43.405003 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.404958 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 28 19:15:43.405185 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.405005 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-121.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 28 19:15:43.405232 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.405196 2565 topology_manager.go:138] "Creating topology manager with none policy"
Apr 28 19:15:43.405232 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.405205 2565 container_manager_linux.go:306] "Creating device plugin manager"
Apr 28 19:15:43.405232 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.405217
2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:15:43.406254 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.406244 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:15:43.407650 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.407640 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:15:43.407748 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.407740 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 28 19:15:43.409596 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.409581 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 28 19:15:43.409692 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.409682 2565 kubelet.go:491] "Attempting to sync node with API server" Apr 28 19:15:43.409732 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.409697 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 28 19:15:43.409732 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.409712 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 28 19:15:43.409732 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.409721 2565 kubelet.go:397] "Adding apiserver pod source" Apr 28 19:15:43.409732 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.409729 2565 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 28 19:15:43.410732 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.410720 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 28 19:15:43.410776 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.410739 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information 
tracking" Apr 28 19:15:43.413145 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.413130 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 28 19:15:43.414942 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.414930 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 28 19:15:43.416104 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.416091 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 28 19:15:43.416148 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.416113 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 28 19:15:43.416148 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.416124 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 28 19:15:43.416148 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.416133 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 28 19:15:43.416148 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.416142 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 28 19:15:43.416251 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.416151 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 28 19:15:43.416251 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.416160 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 28 19:15:43.416251 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.416170 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 28 19:15:43.416251 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.416180 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 28 19:15:43.416251 ip-10-0-133-121 
kubenswrapper[2565]: I0428 19:15:43.416189 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 28 19:15:43.416251 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.416201 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 28 19:15:43.416251 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.416215 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 28 19:15:43.417074 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.417062 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 28 19:15:43.417074 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.417074 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 28 19:15:43.420569 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.420555 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 28 19:15:43.420645 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.420594 2565 server.go:1295] "Started kubelet" Apr 28 19:15:43.420726 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.420697 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 28 19:15:43.421209 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.421159 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 28 19:15:43.421285 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.421273 2565 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 28 19:15:43.421398 ip-10-0-133-121 systemd[1]: Started Kubernetes Kubelet. 
Apr 28 19:15:43.422588 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.422575 2565 server.go:317] "Adding debug handlers to kubelet server" Apr 28 19:15:43.422878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.422862 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 28 19:15:43.426335 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.426308 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 28 19:15:43.426426 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.426343 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-121.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 28 19:15:43.426476 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.426418 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-121.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 28 19:15:43.427149 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.426296 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-121.ec2.internal.18aa9b499d084d59 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-121.ec2.internal,UID:ip-10-0-133-121.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-121.ec2.internal,},FirstTimestamp:2026-04-28 19:15:43.420566873 +0000 UTC m=+0.354236658,LastTimestamp:2026-04-28 19:15:43.420566873 +0000 UTC m=+0.354236658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-121.ec2.internal,}" Apr 28 19:15:43.429654 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.429635 2565 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 28 19:15:43.430290 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.430243 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 28 19:15:43.430713 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.430694 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 28 19:15:43.431431 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431413 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 28 19:15:43.431431 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.431420 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:43.431590 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431414 2565 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 28 19:15:43.431590 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431447 2565 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 28 19:15:43.431590 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431547 2565 reconstruct.go:97] "Volume reconstruction 
finished" Apr 28 19:15:43.431590 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431556 2565 reconciler.go:26] "Reconciler: start to sync state" Apr 28 19:15:43.431747 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431616 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 28 19:15:43.431747 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431630 2565 factory.go:55] Registering systemd factory Apr 28 19:15:43.431747 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431639 2565 factory.go:223] Registration of the systemd container factory successfully Apr 28 19:15:43.431875 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431863 2565 factory.go:153] Registering CRI-O factory Apr 28 19:15:43.431912 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431878 2565 factory.go:223] Registration of the crio container factory successfully Apr 28 19:15:43.431912 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431898 2565 factory.go:103] Registering Raw factory Apr 28 19:15:43.431912 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.431907 2565 manager.go:1196] Started watching for new ooms in manager Apr 28 19:15:43.432285 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.432275 2565 manager.go:319] Starting recovery of all containers Apr 28 19:15:43.433915 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.433888 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 28 19:15:43.434014 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.433938 2565 controller.go:145] "Failed to 
ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-121.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 28 19:15:43.444764 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.444747 2565 manager.go:324] Recovery completed Apr 28 19:15:43.449058 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.449045 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:43.451400 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.451376 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:43.451456 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.451415 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:43.451456 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.451426 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:43.451907 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.451890 2565 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 28 19:15:43.451907 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.451905 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 28 19:15:43.452018 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.451921 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:15:43.454654 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.454640 2565 policy_none.go:49] "None policy: Start" Apr 28 19:15:43.454704 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.454659 2565 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 28 19:15:43.454704 ip-10-0-133-121 kubenswrapper[2565]: I0428 
19:15:43.454674 2565 state_mem.go:35] "Initializing new in-memory state store" Apr 28 19:15:43.466266 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.466180 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-121.ec2.internal.18aa9b499edecea0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-121.ec2.internal,UID:ip-10-0-133-121.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-121.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-121.ec2.internal,},FirstTimestamp:2026-04-28 19:15:43.451401888 +0000 UTC m=+0.385071674,LastTimestamp:2026-04-28 19:15:43.451401888 +0000 UTC m=+0.385071674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-121.ec2.internal,}" Apr 28 19:15:43.477265 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.477172 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-121.ec2.internal.18aa9b499edf1550 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-121.ec2.internal,UID:ip-10-0-133-121.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-133-121.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-133-121.ec2.internal,},FirstTimestamp:2026-04-28 19:15:43.451419984 +0000 UTC m=+0.385089771,LastTimestamp:2026-04-28 
19:15:43.451419984 +0000 UTC m=+0.385089771,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-121.ec2.internal,}" Apr 28 19:15:43.479933 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.479913 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9w4fn" Apr 28 19:15:43.486637 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.486579 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-121.ec2.internal.18aa9b499edf3f56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-121.ec2.internal,UID:ip-10-0-133-121.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-133-121.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-133-121.ec2.internal,},FirstTimestamp:2026-04-28 19:15:43.451430742 +0000 UTC m=+0.385100528,LastTimestamp:2026-04-28 19:15:43.451430742 +0000 UTC m=+0.385100528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-121.ec2.internal,}" Apr 28 19:15:43.495068 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.488553 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9w4fn" Apr 28 19:15:43.495068 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.490612 2565 manager.go:341] "Starting Device Plugin manager" Apr 28 19:15:43.495068 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.490654 2565 manager.go:517] "Failed to read data from checkpoint" 
err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 28 19:15:43.495068 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.490667 2565 server.go:85] "Starting device plugin registration server" Apr 28 19:15:43.495068 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.490883 2565 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 28 19:15:43.495068 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.490893 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 28 19:15:43.495068 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.491012 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 28 19:15:43.495068 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.491090 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 28 19:15:43.495068 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.491099 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 28 19:15:43.495068 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.491851 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 28 19:15:43.495068 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.491880 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:43.496602 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.496582 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 28 19:15:43.497785 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.497769 2565 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 28 19:15:43.497852 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.497792 2565 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 28 19:15:43.497852 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.497806 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 28 19:15:43.497852 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.497812 2565 kubelet.go:2451] "Starting kubelet main sync loop" Apr 28 19:15:43.497852 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.497847 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 28 19:15:43.508131 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.508115 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:43.591297 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.591219 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:43.592149 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.592132 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:43.592218 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.592163 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:43.592218 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.592173 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:43.592218 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.592195 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.598930 ip-10-0-133-121 kubenswrapper[2565]: I0428 
19:15:43.598914 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-121.ec2.internal"] Apr 28 19:15:43.598998 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.598988 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:43.599685 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.599670 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:43.599769 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.599695 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:43.599769 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.599705 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:43.601076 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.601052 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:43.601215 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.601198 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.601256 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.601230 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:43.601669 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.601652 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:43.601743 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.601678 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:43.601743 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.601691 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:43.601743 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.601720 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:43.601743 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.601743 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:43.601920 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.601757 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:43.601920 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.601826 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.601920 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.601843 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-121.ec2.internal\": node 
\"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:43.602866 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.602853 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.602919 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.602876 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:43.603507 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.603492 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:43.603589 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.603519 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:43.603589 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.603532 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:43.619607 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.619587 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:43.621505 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.621487 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-121.ec2.internal\" not found" node="ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.625695 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.625682 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-121.ec2.internal\" not found" node="ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.720287 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.720263 2565 kubelet_node_status.go:515] "Error getting the 
current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:43.733601 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.733581 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57b3119f12c26029aa487a8b6a06e517-config\") pod \"kube-apiserver-proxy-ip-10-0-133-121.ec2.internal\" (UID: \"57b3119f12c26029aa487a8b6a06e517\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.733660 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.733605 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fb7dcfac0e4064e23d87d8470c95d127-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal\" (UID: \"fb7dcfac0e4064e23d87d8470c95d127\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.733660 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.733623 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb7dcfac0e4064e23d87d8470c95d127-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal\" (UID: \"fb7dcfac0e4064e23d87d8470c95d127\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.820336 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.820305 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:43.834637 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.834618 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fb7dcfac0e4064e23d87d8470c95d127-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal\" (UID: \"fb7dcfac0e4064e23d87d8470c95d127\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.834708 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.834642 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb7dcfac0e4064e23d87d8470c95d127-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal\" (UID: \"fb7dcfac0e4064e23d87d8470c95d127\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.834708 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.834659 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57b3119f12c26029aa487a8b6a06e517-config\") pod \"kube-apiserver-proxy-ip-10-0-133-121.ec2.internal\" (UID: \"57b3119f12c26029aa487a8b6a06e517\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.834708 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.834691 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57b3119f12c26029aa487a8b6a06e517-config\") pod \"kube-apiserver-proxy-ip-10-0-133-121.ec2.internal\" (UID: \"57b3119f12c26029aa487a8b6a06e517\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.834857 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.834757 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fb7dcfac0e4064e23d87d8470c95d127-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal\" (UID: \"fb7dcfac0e4064e23d87d8470c95d127\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.834857 ip-10-0-133-121 
kubenswrapper[2565]: I0428 19:15:43.834828 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb7dcfac0e4064e23d87d8470c95d127-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal\" (UID: \"fb7dcfac0e4064e23d87d8470c95d127\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.921033 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:43.920944 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:43.924098 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.924084 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" Apr 28 19:15:43.927743 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:43.927725 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-121.ec2.internal" Apr 28 19:15:44.021904 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:44.021868 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:44.122417 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:44.122390 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:44.222922 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:44.222857 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:44.323499 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:44.323470 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:44.348909 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.348884 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 28 19:15:44.349478 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.349059 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 28 19:15:44.404738 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:44.404696 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb7dcfac0e4064e23d87d8470c95d127.slice/crio-e2165efa89a94cf75bab6857d9c194309c1fb6bac715af3bf23b1d511c1d7cf3 WatchSource:0}: Error finding container e2165efa89a94cf75bab6857d9c194309c1fb6bac715af3bf23b1d511c1d7cf3: Status 404 returned error can't find the container with 
id e2165efa89a94cf75bab6857d9c194309c1fb6bac715af3bf23b1d511c1d7cf3 Apr 28 19:15:44.405194 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:44.405173 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b3119f12c26029aa487a8b6a06e517.slice/crio-21011d9fd22b827620136394e5adf0c74c3e063974a8a345d88b3e956a68353d WatchSource:0}: Error finding container 21011d9fd22b827620136394e5adf0c74c3e063974a8a345d88b3e956a68353d: Status 404 returned error can't find the container with id 21011d9fd22b827620136394e5adf0c74c3e063974a8a345d88b3e956a68353d Apr 28 19:15:44.410030 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.409971 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:15:44.424577 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:44.424555 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:44.430805 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.430790 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 28 19:15:44.457526 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.457505 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:44.474911 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.474855 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 28 19:15:44.489874 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.489844 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-27 19:10:43 +0000 UTC" deadline="2028-02-05 19:37:13.427935154 +0000 UTC" Apr 28 19:15:44.489874 
ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.489872 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15552h21m28.938065535s" Apr 28 19:15:44.500920 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.500873 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-121.ec2.internal" event={"ID":"57b3119f12c26029aa487a8b6a06e517","Type":"ContainerStarted","Data":"21011d9fd22b827620136394e5adf0c74c3e063974a8a345d88b3e956a68353d"} Apr 28 19:15:44.501719 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.501700 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" event={"ID":"fb7dcfac0e4064e23d87d8470c95d127","Type":"ContainerStarted","Data":"e2165efa89a94cf75bab6857d9c194309c1fb6bac715af3bf23b1d511c1d7cf3"} Apr 28 19:15:44.523181 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.523165 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hjkqf" Apr 28 19:15:44.525255 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:44.525238 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:44.549587 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.549564 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hjkqf" Apr 28 19:15:44.625515 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:44.625490 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:44.726083 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:44.725970 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" 
Apr 28 19:15:44.751110 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.751087 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:44.826387 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:44.826360 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-121.ec2.internal\" not found" Apr 28 19:15:44.900505 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.900477 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:44.931346 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.931319 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" Apr 28 19:15:44.949779 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.949752 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:15:44.950664 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.950643 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-121.ec2.internal" Apr 28 19:15:44.968204 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:44.966454 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:15:45.411132 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.410902 2565 apiserver.go:52] "Watching apiserver" Apr 28 19:15:45.423377 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.423353 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 28 19:15:45.425610 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.425583 2565 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-multus/multus-lqw5f","openshift-multus/network-metrics-daemon-nfhpq","openshift-network-operator/iptables-alerter-gsq8s","openshift-ovn-kubernetes/ovnkube-node-hkn59","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r","openshift-cluster-node-tuning-operator/tuned-b5pns","openshift-image-registry/node-ca-jn4w7","openshift-multus/multus-additional-cni-plugins-xt62h","openshift-network-diagnostics/network-check-target-zj9qs","kube-system/konnectivity-agent-4tvmc","kube-system/kube-apiserver-proxy-ip-10-0-133-121.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal"] Apr 28 19:15:45.427033 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.427009 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.428084 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.428065 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:45.428169 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:45.428129 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74" Apr 28 19:15:45.429197 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.429180 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gsq8s" Apr 28 19:15:45.430214 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.430196 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" Apr 28 19:15:45.431235 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.431218 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.432351 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.432230 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 28 19:15:45.432351 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.432291 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 28 19:15:45.432351 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.432296 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 28 19:15:45.432351 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.432307 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 28 19:15:45.432601 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.432466 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-659j2\"" Apr 28 19:15:45.433581 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.433560 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jn4w7" Apr 28 19:15:45.434616 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.434598 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.436127 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.436104 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:45.436597 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.436406 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4tvmc" Apr 28 19:15:45.436678 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:45.436590 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf" Apr 28 19:15:45.439530 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.439509 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.443712 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.443694 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-var-lib-cni-bin\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.443800 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.443717 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14dec99e-7998-4e82-bc5f-a1c596857848-iptables-alerter-script\") pod \"iptables-alerter-gsq8s\" (UID: \"14dec99e-7998-4e82-bc5f-a1c596857848\") " pod="openshift-network-operator/iptables-alerter-gsq8s" Apr 28 19:15:45.443800 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.443733 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-etc-selinux\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" Apr 28 19:15:45.443800 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.443748 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-sys-fs\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" Apr 28 19:15:45.443800 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.443766 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-lib-modules\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.443800 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.443787 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-var-lib-kubelet\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.444067 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.443858 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-run-netns\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " 
pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.444067 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.443898 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzkss\" (UniqueName: \"kubernetes.io/projected/a559869f-dc8c-4397-aa54-b59c274faa74-kube-api-access-dzkss\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:45.444067 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.443925 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-modprobe-d\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.444067 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.443951 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-sysctl-d\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.444067 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.443972 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0b82110c-7495-4dba-b0fc-b29ca1b890f4-host\") pod \"node-ca-jn4w7\" (UID: \"0b82110c-7495-4dba-b0fc-b29ca1b890f4\") " pod="openshift-image-registry/node-ca-jn4w7" Apr 28 19:15:45.444067 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444016 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/0b82110c-7495-4dba-b0fc-b29ca1b890f4-serviceca\") pod \"node-ca-jn4w7\" (UID: \"0b82110c-7495-4dba-b0fc-b29ca1b890f4\") " pod="openshift-image-registry/node-ca-jn4w7" Apr 28 19:15:45.444067 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444047 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-system-cni-dir\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.444067 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444069 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-var-lib-cni-multus\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444097 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14dec99e-7998-4e82-bc5f-a1c596857848-host-slash\") pod \"iptables-alerter-gsq8s\" (UID: \"14dec99e-7998-4e82-bc5f-a1c596857848\") " pod="openshift-network-operator/iptables-alerter-gsq8s" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444111 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444131 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-run\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444152 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-host\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444167 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-tuned\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444188 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-cni-dir\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444202 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 
19:15:45.444216 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv9kc\" (UniqueName: \"kubernetes.io/projected/14dec99e-7998-4e82-bc5f-a1c596857848-kube-api-access-gv9kc\") pod \"iptables-alerter-gsq8s\" (UID: \"14dec99e-7998-4e82-bc5f-a1c596857848\") " pod="openshift-network-operator/iptables-alerter-gsq8s" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444230 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-device-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444250 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkhst\" (UniqueName: \"kubernetes.io/projected/16167a05-2728-4593-8476-13a52840c7fd-kube-api-access-tkhst\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444269 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5fbf\" (UniqueName: \"kubernetes.io/projected/36064ec9-9a98-40d9-8175-bc4e60e23db1-kube-api-access-g5fbf\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444284 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw2sq\" (UniqueName: 
\"kubernetes.io/projected/0b82110c-7495-4dba-b0fc-b29ca1b890f4-kube-api-access-lw2sq\") pod \"node-ca-jn4w7\" (UID: \"0b82110c-7495-4dba-b0fc-b29ca1b890f4\") " pod="openshift-image-registry/node-ca-jn4w7" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444299 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-cnibin\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444320 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-os-release\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444342 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-run-k8s-cni-cncf-io\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444355 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-var-lib-kubelet\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.444443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444369 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-hostroot\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444382 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-daemon-config\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444407 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-run-multus-certs\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444429 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-etc-kubernetes\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444444 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-socket-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444476 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-registration-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444505 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-socket-dir-parent\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444530 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-conf-dir\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444548 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-sysctl-conf\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444562 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-systemd\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444587 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78bf7\" (UniqueName: \"kubernetes.io/projected/83a284ae-2839-4ef8-a791-9c32c55d6694-kube-api-access-78bf7\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444608 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-sysconfig\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444622 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-sys\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444636 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/83a284ae-2839-4ef8-a791-9c32c55d6694-cni-binary-copy\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444649 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-kubernetes\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444668 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/36064ec9-9a98-40d9-8175-bc4e60e23db1-tmp\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444899 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.444969 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.445000 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 28 19:15:45.445249 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.445205 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 28 19:15:45.446149 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.445909 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 28 19:15:45.446149 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.446123 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 28 19:15:45.446149 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.446134 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 28 19:15:45.446306 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.446289 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 28 19:15:45.446471 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.446438 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 28 19:15:45.446536 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.446498 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 28 19:15:45.446536 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.446498 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 28 19:15:45.446660 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.446635 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 28 19:15:45.446729 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.446635 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 28 19:15:45.446943 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.446925 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 28 19:15:45.447360 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447149 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 28 19:15:45.447360 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447231 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 28 19:15:45.447360 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447310 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 28 19:15:45.447557 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447508 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fxtrj\""
Apr 28 19:15:45.447557 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447519 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-k9gmt\""
Apr 28 19:15:45.447557 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447536 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8x9t6\""
Apr 28 19:15:45.447697 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447571 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jhqhr\""
Apr 28 19:15:45.447697 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447638 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 28 19:15:45.447697 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447518 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 28 19:15:45.447847 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447825 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jzmcr\""
Apr 28 19:15:45.447903 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447853 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-w4mjc\""
Apr 28 19:15:45.447903 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.447867 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zj9wk\""
Apr 28 19:15:45.448052 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.448034 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 28 19:15:45.448137 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.448126 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 28 19:15:45.532404 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.532379 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 28 19:15:45.545075 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545045 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-sysconfig\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.545075 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545079 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-sys\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545103 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c174017-a3dc-4241-8008-c41fd1ae8cec-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545125 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/83a284ae-2839-4ef8-a791-9c32c55d6694-cni-binary-copy\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545148 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-kubernetes\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545172 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-run-openvswitch\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545183 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-sys\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545196 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-etc-selinux\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545212 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-sys-fs\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545205 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-kubernetes\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545231 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtdjz\" (UniqueName: \"kubernetes.io/projected/0c174017-a3dc-4241-8008-c41fd1ae8cec-kube-api-access-rtdjz\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545257 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-etc-openvswitch\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545274 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-sys-fs\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545275 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-etc-selinux\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.545293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545286 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545334 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-sysconfig\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545333 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0ca223d-22df-4d91-a877-7adbc2efde17-ovnkube-script-lib\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545375 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-run-netns\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545405 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzkss\" (UniqueName: \"kubernetes.io/projected/a559869f-dc8c-4397-aa54-b59c274faa74-kube-api-access-dzkss\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545418 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-run-netns\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545431 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-modprobe-d\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545463 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-sysctl-d\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545486 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0b82110c-7495-4dba-b0fc-b29ca1b890f4-serviceca\") pod \"node-ca-jn4w7\" (UID: \"0b82110c-7495-4dba-b0fc-b29ca1b890f4\") " pod="openshift-image-registry/node-ca-jn4w7"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545512 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-run-netns\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545560 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-modprobe-d\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545537 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d5wf\" (UniqueName: \"kubernetes.io/projected/f0ca223d-22df-4d91-a877-7adbc2efde17-kube-api-access-4d5wf\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545640 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-sysctl-d\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545678 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c174017-a3dc-4241-8008-c41fd1ae8cec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545721 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c174017-a3dc-4241-8008-c41fd1ae8cec-cni-binary-copy\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545751 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545781 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-system-cni-dir\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.545913 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545811 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-var-lib-cni-multus\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545854 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-system-cni-dir\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545898 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14dec99e-7998-4e82-bc5f-a1c596857848-host-slash\") pod \"iptables-alerter-gsq8s\" (UID: \"14dec99e-7998-4e82-bc5f-a1c596857848\") " pod="openshift-network-operator/iptables-alerter-gsq8s"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545908 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-var-lib-cni-multus\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545945 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.545990 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-host\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546002 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14dec99e-7998-4e82-bc5f-a1c596857848-host-slash\") pod \"iptables-alerter-gsq8s\" (UID: \"14dec99e-7998-4e82-bc5f-a1c596857848\") " pod="openshift-network-operator/iptables-alerter-gsq8s"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546021 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-tuned\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546037 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-host\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546036 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546049 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-cni-dir\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546079 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-device-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546095 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-cni-dir\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546103 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkhst\" (UniqueName: \"kubernetes.io/projected/16167a05-2728-4593-8476-13a52840c7fd-kube-api-access-tkhst\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546142 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-device-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546128 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-log-socket\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546215 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/83a284ae-2839-4ef8-a791-9c32c55d6694-cni-binary-copy\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.546739 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546303 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0ca223d-22df-4d91-a877-7adbc2efde17-env-overrides\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546342 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0ca223d-22df-4d91-a877-7adbc2efde17-ovn-node-metrics-cert\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546371 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-cnibin\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546396 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-os-release\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546421 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-run-k8s-cni-cncf-io\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546455 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-var-lib-kubelet\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546494 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-run-k8s-cni-cncf-io\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546483 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-daemon-config\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546537 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-run-multus-certs\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546891 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0b82110c-7495-4dba-b0fc-b29ca1b890f4-serviceca\") pod \"node-ca-jn4w7\" (UID: \"0b82110c-7495-4dba-b0fc-b29ca1b890f4\") " pod="openshift-image-registry/node-ca-jn4w7"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546562 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-sysctl-conf\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546957 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-os-release\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547005 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-os-release\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547035 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-socket-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547053 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-sysctl-conf\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546456 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-cnibin\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547070 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-registration-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.547466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547119 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-registration-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547118 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-system-cni-dir\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h"
Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547161 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b1a47c9c-0b33-44f3-8e5a-5f69cab573b7-agent-certs\") pod \"konnectivity-agent-4tvmc\" (UID: \"b1a47c9c-0b33-44f3-8e5a-5f69cab573b7\") " pod="kube-system/konnectivity-agent-4tvmc"
Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547193 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-cni-netd\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547229 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78bf7\" (UniqueName: \"kubernetes.io/projected/83a284ae-2839-4ef8-a791-9c32c55d6694-kube-api-access-78bf7\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547251 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16167a05-2728-4593-8476-13a52840c7fd-socket-dir\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r"
Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547263 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-var-lib-openvswitch\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547292 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-daemon-config\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547315 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-var-lib-kubelet\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f"
Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547360 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/36064ec9-9a98-40d9-8175-bc4e60e23db1-tmp\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns"
Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547545 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-run-ovn\") pod
\"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547590 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-cni-bin\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547627 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-var-lib-cni-bin\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547675 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14dec99e-7998-4e82-bc5f-a1c596857848-iptables-alerter-script\") pod \"iptables-alerter-gsq8s\" (UID: \"14dec99e-7998-4e82-bc5f-a1c596857848\") " pod="openshift-network-operator/iptables-alerter-gsq8s" Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547693 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-var-lib-cni-bin\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547706 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-lib-modules\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547730 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 28 19:15:45.548193 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.547829 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-var-lib-kubelet\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.546608 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-host-run-multus-certs\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548087 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-systemd-units\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548189 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-lib-modules\") pod \"tuned-b5pns\" (UID: 
\"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548224 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14dec99e-7998-4e82-bc5f-a1c596857848-iptables-alerter-script\") pod \"iptables-alerter-gsq8s\" (UID: \"14dec99e-7998-4e82-bc5f-a1c596857848\") " pod="openshift-network-operator/iptables-alerter-gsq8s" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548250 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-var-lib-kubelet\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548296 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-node-log\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548359 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0b82110c-7495-4dba-b0fc-b29ca1b890f4-host\") pod \"node-ca-jn4w7\" (UID: \"0b82110c-7495-4dba-b0fc-b29ca1b890f4\") " pod="openshift-image-registry/node-ca-jn4w7" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548402 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0b82110c-7495-4dba-b0fc-b29ca1b890f4-host\") pod \"node-ca-jn4w7\" (UID: 
\"0b82110c-7495-4dba-b0fc-b29ca1b890f4\") " pod="openshift-image-registry/node-ca-jn4w7" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548422 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98gww\" (UniqueName: \"kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww\") pod \"network-check-target-zj9qs\" (UID: \"d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf\") " pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548465 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-run\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548499 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-cnibin\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548566 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-run\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548609 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548643 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv9kc\" (UniqueName: \"kubernetes.io/projected/14dec99e-7998-4e82-bc5f-a1c596857848-kube-api-access-gv9kc\") pod \"iptables-alerter-gsq8s\" (UID: \"14dec99e-7998-4e82-bc5f-a1c596857848\") " pod="openshift-network-operator/iptables-alerter-gsq8s" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548672 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5fbf\" (UniqueName: \"kubernetes.io/projected/36064ec9-9a98-40d9-8175-bc4e60e23db1-kube-api-access-g5fbf\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548697 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw2sq\" (UniqueName: \"kubernetes.io/projected/0b82110c-7495-4dba-b0fc-b29ca1b890f4-kube-api-access-lw2sq\") pod \"node-ca-jn4w7\" (UID: \"0b82110c-7495-4dba-b0fc-b29ca1b890f4\") " pod="openshift-image-registry/node-ca-jn4w7" Apr 28 19:15:45.548941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548730 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b1a47c9c-0b33-44f3-8e5a-5f69cab573b7-konnectivity-ca\") pod \"konnectivity-agent-4tvmc\" (UID: \"b1a47c9c-0b33-44f3-8e5a-5f69cab573b7\") " pod="kube-system/konnectivity-agent-4tvmc" Apr 28 19:15:45.549681 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548762 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-hostroot\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.549681 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548789 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-etc-kubernetes\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.549681 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548820 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-kubelet\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.549681 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548849 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-run-ovn-kubernetes\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.549681 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548874 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0ca223d-22df-4d91-a877-7adbc2efde17-ovnkube-config\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.549681 ip-10-0-133-121 
kubenswrapper[2565]: I0428 19:15:45.548903 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-socket-dir-parent\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.549681 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548931 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-conf-dir\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.549681 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.548962 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-systemd\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.549681 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.549020 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-slash\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.549681 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.549052 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-run-systemd\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.549681 
ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:45.549198 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:45.549681 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:45.549284 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs podName:a559869f-dc8c-4397-aa54-b59c274faa74 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:46.049248367 +0000 UTC m=+2.982918162 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs") pod "network-metrics-daemon-nfhpq" (UID: "a559869f-dc8c-4397-aa54-b59c274faa74") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:45.550236 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.550029 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-hostroot\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.550236 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.550087 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-etc-kubernetes\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.550236 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.550178 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:10:44 +0000 UTC" deadline="2027-10-09 01:00:38.779929355 +0000 UTC" Apr 28 19:15:45.550236 ip-10-0-133-121 kubenswrapper[2565]: I0428 
19:15:45.550195 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12677h44m53.229736808s" Apr 28 19:15:45.550236 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.550209 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-systemd\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.550473 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.550275 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-socket-dir-parent\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.550473 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.550320 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/83a284ae-2839-4ef8-a791-9c32c55d6694-multus-conf-dir\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.551643 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.551618 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/36064ec9-9a98-40d9-8175-bc4e60e23db1-tmp\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.551643 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.551632 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/36064ec9-9a98-40d9-8175-bc4e60e23db1-etc-tuned\") pod \"tuned-b5pns\" (UID: 
\"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.556648 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.556623 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzkss\" (UniqueName: \"kubernetes.io/projected/a559869f-dc8c-4397-aa54-b59c274faa74-kube-api-access-dzkss\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:45.560158 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.560135 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78bf7\" (UniqueName: \"kubernetes.io/projected/83a284ae-2839-4ef8-a791-9c32c55d6694-kube-api-access-78bf7\") pod \"multus-lqw5f\" (UID: \"83a284ae-2839-4ef8-a791-9c32c55d6694\") " pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.560259 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.560243 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkhst\" (UniqueName: \"kubernetes.io/projected/16167a05-2728-4593-8476-13a52840c7fd-kube-api-access-tkhst\") pod \"aws-ebs-csi-driver-node-qr59r\" (UID: \"16167a05-2728-4593-8476-13a52840c7fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" Apr 28 19:15:45.567382 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.567356 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw2sq\" (UniqueName: \"kubernetes.io/projected/0b82110c-7495-4dba-b0fc-b29ca1b890f4-kube-api-access-lw2sq\") pod \"node-ca-jn4w7\" (UID: \"0b82110c-7495-4dba-b0fc-b29ca1b890f4\") " pod="openshift-image-registry/node-ca-jn4w7" Apr 28 19:15:45.568174 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.568154 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5fbf\" (UniqueName: 
\"kubernetes.io/projected/36064ec9-9a98-40d9-8175-bc4e60e23db1-kube-api-access-g5fbf\") pod \"tuned-b5pns\" (UID: \"36064ec9-9a98-40d9-8175-bc4e60e23db1\") " pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.568662 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.568640 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv9kc\" (UniqueName: \"kubernetes.io/projected/14dec99e-7998-4e82-bc5f-a1c596857848-kube-api-access-gv9kc\") pod \"iptables-alerter-gsq8s\" (UID: \"14dec99e-7998-4e82-bc5f-a1c596857848\") " pod="openshift-network-operator/iptables-alerter-gsq8s" Apr 28 19:15:45.649443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649412 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-slash\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.649443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649446 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-run-systemd\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.649675 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649464 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c174017-a3dc-4241-8008-c41fd1ae8cec-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.649675 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649483 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-run-openvswitch\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.649675 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649522 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtdjz\" (UniqueName: \"kubernetes.io/projected/0c174017-a3dc-4241-8008-c41fd1ae8cec-kube-api-access-rtdjz\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.649675 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649532 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-run-systemd\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.649675 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649550 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-etc-openvswitch\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.649675 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649560 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-run-openvswitch\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.649675 ip-10-0-133-121 kubenswrapper[2565]: I0428 
19:15:45.649523 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-slash\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.649675 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649596 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.649675 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649623 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0ca223d-22df-4d91-a877-7adbc2efde17-ovnkube-script-lib\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.649675 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649645 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-run-netns\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.649675 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649670 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d5wf\" (UniqueName: \"kubernetes.io/projected/f0ca223d-22df-4d91-a877-7adbc2efde17-kube-api-access-4d5wf\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649688 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649693 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c174017-a3dc-4241-8008-c41fd1ae8cec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649741 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c174017-a3dc-4241-8008-c41fd1ae8cec-cni-binary-copy\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649778 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649817 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-etc-openvswitch\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649821 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-log-socket\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649854 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-log-socket\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649858 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0ca223d-22df-4d91-a877-7adbc2efde17-env-overrides\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649887 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0ca223d-22df-4d91-a877-7adbc2efde17-ovn-node-metrics-cert\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649921 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-os-release\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649949 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-system-cni-dir\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.649973 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b1a47c9c-0b33-44f3-8e5a-5f69cab573b7-agent-certs\") pod \"konnectivity-agent-4tvmc\" (UID: \"b1a47c9c-0b33-44f3-8e5a-5f69cab573b7\") " pod="kube-system/konnectivity-agent-4tvmc" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650017 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-cni-netd\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650043 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-var-lib-openvswitch\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650064 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c174017-a3dc-4241-8008-c41fd1ae8cec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650066 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c174017-a3dc-4241-8008-c41fd1ae8cec-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.650216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650070 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-run-ovn\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650105 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-run-ovn\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650121 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-run-netns\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650146 
2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-cni-bin\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650185 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-systemd-units\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650213 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-node-log\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650233 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-cni-bin\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650218 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-system-cni-dir\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 
19:15:45.650248 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98gww\" (UniqueName: \"kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww\") pod \"network-check-target-zj9qs\" (UID: \"d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf\") " pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650284 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-cnibin\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650329 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b1a47c9c-0b33-44f3-8e5a-5f69cab573b7-konnectivity-ca\") pod \"konnectivity-agent-4tvmc\" (UID: \"b1a47c9c-0b33-44f3-8e5a-5f69cab573b7\") " pod="kube-system/konnectivity-agent-4tvmc" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650353 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c174017-a3dc-4241-8008-c41fd1ae8cec-cni-binary-copy\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650357 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-kubelet\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650363 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-cnibin\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650398 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-run-ovn-kubernetes\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650402 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-kubelet\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650283 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-cni-netd\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.650878 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650426 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0ca223d-22df-4d91-a877-7adbc2efde17-ovnkube-config\") pod \"ovnkube-node-hkn59\" (UID: 
\"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.651445 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650456 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-systemd-units\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.651445 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650475 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.651445 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650315 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-var-lib-openvswitch\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.651445 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650500 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-host-run-ovn-kubernetes\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.651445 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650188 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c174017-a3dc-4241-8008-c41fd1ae8cec-os-release\") pod 
\"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.651445 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650533 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0ca223d-22df-4d91-a877-7adbc2efde17-node-log\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.651445 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650724 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0ca223d-22df-4d91-a877-7adbc2efde17-ovnkube-script-lib\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.651445 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.650863 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0ca223d-22df-4d91-a877-7adbc2efde17-ovnkube-config\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.651445 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.651379 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0ca223d-22df-4d91-a877-7adbc2efde17-env-overrides\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.651445 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.651391 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b1a47c9c-0b33-44f3-8e5a-5f69cab573b7-konnectivity-ca\") pod 
\"konnectivity-agent-4tvmc\" (UID: \"b1a47c9c-0b33-44f3-8e5a-5f69cab573b7\") " pod="kube-system/konnectivity-agent-4tvmc" Apr 28 19:15:45.652442 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.652418 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b1a47c9c-0b33-44f3-8e5a-5f69cab573b7-agent-certs\") pod \"konnectivity-agent-4tvmc\" (UID: \"b1a47c9c-0b33-44f3-8e5a-5f69cab573b7\") " pod="kube-system/konnectivity-agent-4tvmc" Apr 28 19:15:45.652672 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.652652 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0ca223d-22df-4d91-a877-7adbc2efde17-ovn-node-metrics-cert\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.660108 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:45.660088 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:15:45.660108 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:45.660109 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:15:45.660267 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:45.660123 2565 projected.go:194] Error preparing data for projected volume kube-api-access-98gww for pod openshift-network-diagnostics/network-check-target-zj9qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:45.660267 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:45.660188 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww podName:d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf nodeName:}" failed. No retries permitted until 2026-04-28 19:15:46.160172037 +0000 UTC m=+3.093841830 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-98gww" (UniqueName: "kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww") pod "network-check-target-zj9qs" (UID: "d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:45.662255 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.662174 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtdjz\" (UniqueName: \"kubernetes.io/projected/0c174017-a3dc-4241-8008-c41fd1ae8cec-kube-api-access-rtdjz\") pod \"multus-additional-cni-plugins-xt62h\" (UID: \"0c174017-a3dc-4241-8008-c41fd1ae8cec\") " pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.662411 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.662395 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d5wf\" (UniqueName: \"kubernetes.io/projected/f0ca223d-22df-4d91-a877-7adbc2efde17-kube-api-access-4d5wf\") pod \"ovnkube-node-hkn59\" (UID: \"f0ca223d-22df-4d91-a877-7adbc2efde17\") " pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.740783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.740757 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lqw5f" Apr 28 19:15:45.747463 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.747438 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-gsq8s" Apr 28 19:15:45.755162 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.755132 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" Apr 28 19:15:45.760714 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.760695 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b5pns" Apr 28 19:15:45.766278 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.766262 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jn4w7" Apr 28 19:15:45.771757 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.771738 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xt62h" Apr 28 19:15:45.779312 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.779292 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4tvmc" Apr 28 19:15:45.783904 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.783887 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" Apr 28 19:15:45.898057 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:45.898017 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:45.969676 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:45.969647 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c174017_a3dc_4241_8008_c41fd1ae8cec.slice/crio-d262ba9d481d4d45438304a95bd9576319ec75cf7ca942418ca73a562841ff05 WatchSource:0}: Error finding container d262ba9d481d4d45438304a95bd9576319ec75cf7ca942418ca73a562841ff05: Status 404 returned error can't find the container with id d262ba9d481d4d45438304a95bd9576319ec75cf7ca942418ca73a562841ff05 Apr 28 19:15:45.971137 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:45.971108 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36064ec9_9a98_40d9_8175_bc4e60e23db1.slice/crio-3b4e3270bc4dab9dd567f1aa4cd8690d06eb9738caa45dfb906faae855cca2e0 WatchSource:0}: Error finding container 3b4e3270bc4dab9dd567f1aa4cd8690d06eb9738caa45dfb906faae855cca2e0: Status 404 returned error can't find the container with id 3b4e3270bc4dab9dd567f1aa4cd8690d06eb9738caa45dfb906faae855cca2e0 Apr 28 19:15:45.972118 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:45.972098 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14dec99e_7998_4e82_bc5f_a1c596857848.slice/crio-3cea1097104296b73dabae417c3c621ba54269f11bc60796863d8dfde75adc82 WatchSource:0}: Error finding container 3cea1097104296b73dabae417c3c621ba54269f11bc60796863d8dfde75adc82: Status 404 returned error can't find the container with id 3cea1097104296b73dabae417c3c621ba54269f11bc60796863d8dfde75adc82 Apr 28 19:15:45.973089 ip-10-0-133-121 
kubenswrapper[2565]: W0428 19:15:45.973039 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a284ae_2839_4ef8_a791_9c32c55d6694.slice/crio-1aba58e7be11c4ffcee015ab5ad731450fa8a444584a54b7e48740c537d01ed3 WatchSource:0}: Error finding container 1aba58e7be11c4ffcee015ab5ad731450fa8a444584a54b7e48740c537d01ed3: Status 404 returned error can't find the container with id 1aba58e7be11c4ffcee015ab5ad731450fa8a444584a54b7e48740c537d01ed3 Apr 28 19:15:45.974542 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:15:45.974249 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0ca223d_22df_4d91_a877_7adbc2efde17.slice/crio-d56b83c03c5f4146f05034a11106a7e2138bfd6797a7baf74410a2cc833eb015 WatchSource:0}: Error finding container d56b83c03c5f4146f05034a11106a7e2138bfd6797a7baf74410a2cc833eb015: Status 404 returned error can't find the container with id d56b83c03c5f4146f05034a11106a7e2138bfd6797a7baf74410a2cc833eb015 Apr 28 19:15:46.053850 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.053827 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:46.053953 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:46.053935 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:46.054017 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:46.054006 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs podName:a559869f-dc8c-4397-aa54-b59c274faa74 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:15:47.053988913 +0000 UTC m=+3.987658706 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs") pod "network-metrics-daemon-nfhpq" (UID: "a559869f-dc8c-4397-aa54-b59c274faa74") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:46.255511 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.255435 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98gww\" (UniqueName: \"kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww\") pod \"network-check-target-zj9qs\" (UID: \"d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf\") " pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:46.255636 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:46.255588 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:15:46.255636 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:46.255608 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:15:46.255636 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:46.255618 2565 projected.go:194] Error preparing data for projected volume kube-api-access-98gww for pod openshift-network-diagnostics/network-check-target-zj9qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:46.255742 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:46.255665 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww 
podName:d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf nodeName:}" failed. No retries permitted until 2026-04-28 19:15:47.255651988 +0000 UTC m=+4.189321766 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-98gww" (UniqueName: "kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww") pod "network-check-target-zj9qs" (UID: "d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:46.512064 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.511971 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-121.ec2.internal" event={"ID":"57b3119f12c26029aa487a8b6a06e517","Type":"ContainerStarted","Data":"372f416418dc8a3ec6a55addd9721bb14bbd8c464e13de28e09136a10369c94e"} Apr 28 19:15:46.514012 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.513941 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" event={"ID":"16167a05-2728-4593-8476-13a52840c7fd","Type":"ContainerStarted","Data":"5395d856b46258083cc326babe45a7f406a86f880912dc865efdfa5b20c301ac"} Apr 28 19:15:46.515334 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.515272 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" event={"ID":"f0ca223d-22df-4d91-a877-7adbc2efde17","Type":"ContainerStarted","Data":"d56b83c03c5f4146f05034a11106a7e2138bfd6797a7baf74410a2cc833eb015"} Apr 28 19:15:46.517101 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.517034 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lqw5f" event={"ID":"83a284ae-2839-4ef8-a791-9c32c55d6694","Type":"ContainerStarted","Data":"1aba58e7be11c4ffcee015ab5ad731450fa8a444584a54b7e48740c537d01ed3"} Apr 28 19:15:46.521075 ip-10-0-133-121 
kubenswrapper[2565]: I0428 19:15:46.520936 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b5pns" event={"ID":"36064ec9-9a98-40d9-8175-bc4e60e23db1","Type":"ContainerStarted","Data":"3b4e3270bc4dab9dd567f1aa4cd8690d06eb9738caa45dfb906faae855cca2e0"} Apr 28 19:15:46.533130 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.533054 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jn4w7" event={"ID":"0b82110c-7495-4dba-b0fc-b29ca1b890f4","Type":"ContainerStarted","Data":"7c5d71ae187efcc538ebca37af904eed097c793f9958d81b2f33fc19dd8b9f9c"} Apr 28 19:15:46.539814 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.539787 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4tvmc" event={"ID":"b1a47c9c-0b33-44f3-8e5a-5f69cab573b7","Type":"ContainerStarted","Data":"f663520ef10cb6f587dbc6bca104201d4bc15819c32299ec3de8ad5abbf77ef0"} Apr 28 19:15:46.546538 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.546511 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gsq8s" event={"ID":"14dec99e-7998-4e82-bc5f-a1c596857848","Type":"ContainerStarted","Data":"3cea1097104296b73dabae417c3c621ba54269f11bc60796863d8dfde75adc82"} Apr 28 19:15:46.550477 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.550449 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:10:44 +0000 UTC" deadline="2027-10-11 19:30:36.131218184 +0000 UTC" Apr 28 19:15:46.550583 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.550478 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12744h14m49.580743169s" Apr 28 19:15:46.553960 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:46.553866 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-xt62h" event={"ID":"0c174017-a3dc-4241-8008-c41fd1ae8cec","Type":"ContainerStarted","Data":"d262ba9d481d4d45438304a95bd9576319ec75cf7ca942418ca73a562841ff05"} Apr 28 19:15:47.062170 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:47.062053 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:47.062331 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:47.062198 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:47.062331 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:47.062260 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs podName:a559869f-dc8c-4397-aa54-b59c274faa74 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:49.062241652 +0000 UTC m=+5.995911449 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs") pod "network-metrics-daemon-nfhpq" (UID: "a559869f-dc8c-4397-aa54-b59c274faa74") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:47.263711 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:47.263635 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98gww\" (UniqueName: \"kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww\") pod \"network-check-target-zj9qs\" (UID: \"d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf\") " pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:47.263908 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:47.263804 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:15:47.263908 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:47.263822 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:15:47.263908 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:47.263835 2565 projected.go:194] Error preparing data for projected volume kube-api-access-98gww for pod openshift-network-diagnostics/network-check-target-zj9qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:47.263908 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:47.263891 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww podName:d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf nodeName:}" failed. 
No retries permitted until 2026-04-28 19:15:49.263873346 +0000 UTC m=+6.197543134 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-98gww" (UniqueName: "kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww") pod "network-check-target-zj9qs" (UID: "d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:47.501293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:47.501223 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:47.501441 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:47.501346 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf" Apr 28 19:15:47.501765 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:47.501745 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:47.501878 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:47.501854 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74" Apr 28 19:15:47.561840 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:47.561756 2565 generic.go:358] "Generic (PLEG): container finished" podID="fb7dcfac0e4064e23d87d8470c95d127" containerID="3c6780d5addaa30b151da161e3915b491b0a1deadfcf6c93b6f4b1363baeefe3" exitCode=0 Apr 28 19:15:47.562668 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:47.562644 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" event={"ID":"fb7dcfac0e4064e23d87d8470c95d127","Type":"ContainerDied","Data":"3c6780d5addaa30b151da161e3915b491b0a1deadfcf6c93b6f4b1363baeefe3"} Apr 28 19:15:47.581691 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:47.581622 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-121.ec2.internal" podStartSLOduration=3.581604686 podStartE2EDuration="3.581604686s" podCreationTimestamp="2026-04-28 19:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:15:46.532430959 +0000 UTC m=+3.466100756" watchObservedRunningTime="2026-04-28 19:15:47.581604686 +0000 UTC m=+4.515274483" Apr 28 19:15:48.568320 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:48.568235 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" event={"ID":"fb7dcfac0e4064e23d87d8470c95d127","Type":"ContainerStarted","Data":"b803a23227907e0043795e0e802fece5d8d30d01057f783ecf9f3a7fd433ef1d"} Apr 28 19:15:48.593356 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:48.593298 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-121.ec2.internal" podStartSLOduration=4.593278816 
podStartE2EDuration="4.593278816s" podCreationTimestamp="2026-04-28 19:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:15:48.592963388 +0000 UTC m=+5.526633198" watchObservedRunningTime="2026-04-28 19:15:48.593278816 +0000 UTC m=+5.526948612" Apr 28 19:15:49.078904 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:49.078874 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:49.079081 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:49.079039 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:49.079142 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:49.079099 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs podName:a559869f-dc8c-4397-aa54-b59c274faa74 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:53.079080302 +0000 UTC m=+10.012750078 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs") pod "network-metrics-daemon-nfhpq" (UID: "a559869f-dc8c-4397-aa54-b59c274faa74") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:49.280202 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:49.280157 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98gww\" (UniqueName: \"kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww\") pod \"network-check-target-zj9qs\" (UID: \"d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf\") " pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:49.280364 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:49.280330 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:15:49.280464 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:49.280385 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:15:49.280464 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:49.280442 2565 projected.go:194] Error preparing data for projected volume kube-api-access-98gww for pod openshift-network-diagnostics/network-check-target-zj9qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:49.280574 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:49.280500 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww podName:d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf nodeName:}" failed. 
No retries permitted until 2026-04-28 19:15:53.280481633 +0000 UTC m=+10.214151407 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-98gww" (UniqueName: "kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww") pod "network-check-target-zj9qs" (UID: "d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:49.499207 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:49.499128 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:49.499364 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:49.499267 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74" Apr 28 19:15:49.500632 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:49.500499 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:49.500632 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:49.500596 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf" Apr 28 19:15:51.499461 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:51.498969 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:51.499461 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:51.499122 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74" Apr 28 19:15:51.499461 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:51.499407 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:51.500027 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:51.499606 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf" Apr 28 19:15:53.110121 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:53.110083 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:53.110560 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:53.110213 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:53.110560 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:53.110300 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs podName:a559869f-dc8c-4397-aa54-b59c274faa74 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:01.110278535 +0000 UTC m=+18.043948309 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs") pod "network-metrics-daemon-nfhpq" (UID: "a559869f-dc8c-4397-aa54-b59c274faa74") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:53.310998 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:53.310913 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98gww\" (UniqueName: \"kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww\") pod \"network-check-target-zj9qs\" (UID: \"d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf\") " pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:53.311183 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:53.311061 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:15:53.311183 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:53.311082 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:15:53.311183 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:53.311094 2565 projected.go:194] Error preparing data for projected volume kube-api-access-98gww for pod openshift-network-diagnostics/network-check-target-zj9qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:53.311183 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:53.311152 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww podName:d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:01.311134509 +0000 UTC m=+18.244804286 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-98gww" (UniqueName: "kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww") pod "network-check-target-zj9qs" (UID: "d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:53.501418 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:53.500281 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:53.501418 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:53.500406 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74" Apr 28 19:15:53.501418 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:53.500423 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:53.501418 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:53.500505 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf" Apr 28 19:15:55.498959 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:55.498920 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:55.498959 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:55.498948 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:55.499419 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:55.499057 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74" Apr 28 19:15:55.499419 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:55.499229 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf" Apr 28 19:15:57.498557 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:57.498332 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:57.499009 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:57.498390 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:57.499009 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:57.498676 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74" Apr 28 19:15:57.499009 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:57.498727 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf" Apr 28 19:15:59.498627 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:59.498584 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:15:59.499057 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:15:59.498796 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:15:59.499057 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:59.498796 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74" Apr 28 19:15:59.499057 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:15:59.498916 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf" Apr 28 19:16:01.165865 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:01.165833 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:16:01.166350 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:01.165949 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:01.166350 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:01.166014 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs podName:a559869f-dc8c-4397-aa54-b59c274faa74 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.166000413 +0000 UTC m=+34.099670186 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs") pod "network-metrics-daemon-nfhpq" (UID: "a559869f-dc8c-4397-aa54-b59c274faa74") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:01.368363 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:01.368328 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98gww\" (UniqueName: \"kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww\") pod \"network-check-target-zj9qs\" (UID: \"d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf\") " pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:16:01.368532 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:01.368503 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:01.368532 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:01.368526 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:01.368615 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:01.368539 2565 projected.go:194] Error preparing data for projected volume kube-api-access-98gww for pod openshift-network-diagnostics/network-check-target-zj9qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:01.368615 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:01.368602 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww podName:d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:17.368583448 +0000 UTC m=+34.302253254 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-98gww" (UniqueName: "kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww") pod "network-check-target-zj9qs" (UID: "d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:01.498191 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:01.498108 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:16:01.498350 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:01.498120 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:16:01.498350 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:01.498221 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf" Apr 28 19:16:01.498350 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:01.498316 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74" Apr 28 19:16:02.602689 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:02.602653 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b5pns" event={"ID":"36064ec9-9a98-40d9-8175-bc4e60e23db1","Type":"ContainerStarted","Data":"8aba8cbe1b4ac1a96295d4b0f548a56c5094ca57efe36740a19c5ba3c30f21cd"} Apr 28 19:16:02.635232 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:02.635179 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-b5pns" podStartSLOduration=3.197860531 podStartE2EDuration="19.635161088s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.973301554 +0000 UTC m=+2.906971337" lastFinishedPulling="2026-04-28 19:16:02.410602115 +0000 UTC m=+19.344271894" observedRunningTime="2026-04-28 19:16:02.633720768 +0000 UTC m=+19.567390557" watchObservedRunningTime="2026-04-28 19:16:02.635161088 +0000 UTC m=+19.568830884" Apr 28 19:16:03.499035 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.498623 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:16:03.499274 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.498684 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:16:03.499274 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:03.499180 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74" Apr 28 19:16:03.499274 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:03.499189 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf" Apr 28 19:16:03.503924 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.503904 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 28 19:16:03.607107 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.607063 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lqw5f" event={"ID":"83a284ae-2839-4ef8-a791-9c32c55d6694","Type":"ContainerStarted","Data":"27afde97f877de513c2e20ac6800488b95df745ff2de7a697adca04bf418f8a2"} Apr 28 19:16:03.608572 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.608552 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jn4w7" event={"ID":"0b82110c-7495-4dba-b0fc-b29ca1b890f4","Type":"ContainerStarted","Data":"a6b3f161a8bf6f355ca7e795bcde4cdd80ef2580af0c16df66b6f9e17ebd59c2"} Apr 28 19:16:03.609758 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.609742 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4tvmc" event={"ID":"b1a47c9c-0b33-44f3-8e5a-5f69cab573b7","Type":"ContainerStarted","Data":"92a0691bc8721704e6c132d27b8fe3099e252d81496b4850f0e0525f9c248c3e"} Apr 28 19:16:03.613596 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.613574 2565 generic.go:358] "Generic (PLEG): container finished" 
podID="0c174017-a3dc-4241-8008-c41fd1ae8cec" containerID="1ac9031817ed04bbd83cebb83b1c9e6d67b4e5a11d1d00d5edc67e1bd8d46416" exitCode=0
Apr 28 19:16:03.613694 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.613639 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xt62h" event={"ID":"0c174017-a3dc-4241-8008-c41fd1ae8cec","Type":"ContainerDied","Data":"1ac9031817ed04bbd83cebb83b1c9e6d67b4e5a11d1d00d5edc67e1bd8d46416"}
Apr 28 19:16:03.615127 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.615106 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" event={"ID":"16167a05-2728-4593-8476-13a52840c7fd","Type":"ContainerStarted","Data":"e6c2fb265ae8ae37e2e797350205c4ffecc074f92f27e2761c6e1afc4a7a4e80"}
Apr 28 19:16:03.615197 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.615133 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" event={"ID":"16167a05-2728-4593-8476-13a52840c7fd","Type":"ContainerStarted","Data":"1abdf7e2d8d6f3bb5fbe690f608a9d261d2d7fe664b4ddb25100542bd967a652"}
Apr 28 19:16:03.617466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.617443 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" event={"ID":"f0ca223d-22df-4d91-a877-7adbc2efde17","Type":"ContainerStarted","Data":"0e5f7c5fc956e86180793530e0e4f9157a6633c5bb5f9502d0ee1a90e8cafa3a"}
Apr 28 19:16:03.617528 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.617473 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" event={"ID":"f0ca223d-22df-4d91-a877-7adbc2efde17","Type":"ContainerStarted","Data":"720ad82fd8f8973809d3b5a5e816a4309c626378829430857350f918bd36719f"}
Apr 28 19:16:03.617528 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.617487 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" event={"ID":"f0ca223d-22df-4d91-a877-7adbc2efde17","Type":"ContainerStarted","Data":"00fdc621226b074e2bd81988e2cb036fc9bac91dab90c9cfbb5620e52b4eddb2"}
Apr 28 19:16:03.617528 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.617499 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" event={"ID":"f0ca223d-22df-4d91-a877-7adbc2efde17","Type":"ContainerStarted","Data":"8cb709b40e645dc02601e95aa3698cd95e69a9d61146cabb8666b9f9076f6e00"}
Apr 28 19:16:03.617528 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.617512 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" event={"ID":"f0ca223d-22df-4d91-a877-7adbc2efde17","Type":"ContainerStarted","Data":"1ed7a37a295e831e62b88eb60890f855e23a25ed5d3bcd17ee2683a4be5789cf"}
Apr 28 19:16:03.617528 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.617523 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" event={"ID":"f0ca223d-22df-4d91-a877-7adbc2efde17","Type":"ContainerStarted","Data":"2eb48159636f123abb55e34d9b8d2e3c8769079d3de45ebf1a0e787f34446243"}
Apr 28 19:16:03.625130 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.625088 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lqw5f" podStartSLOduration=4.034408398 podStartE2EDuration="20.625075084s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.975025965 +0000 UTC m=+2.908695739" lastFinishedPulling="2026-04-28 19:16:02.565692637 +0000 UTC m=+19.499362425" observedRunningTime="2026-04-28 19:16:03.624715431 +0000 UTC m=+20.558385222" watchObservedRunningTime="2026-04-28 19:16:03.625075084 +0000 UTC m=+20.558744881"
Apr 28 19:16:03.637398 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:03.637360 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jn4w7" podStartSLOduration=4.3635846130000004 podStartE2EDuration="20.637350152s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.981339477 +0000 UTC m=+2.915009255" lastFinishedPulling="2026-04-28 19:16:02.255105021 +0000 UTC m=+19.188774794" observedRunningTime="2026-04-28 19:16:03.63693948 +0000 UTC m=+20.570609276" watchObservedRunningTime="2026-04-28 19:16:03.637350152 +0000 UTC m=+20.571019947"
Apr 28 19:16:04.503927 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:04.503739 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-28T19:16:03.503919984Z","UUID":"dafcaab0-c6e6-4696-9d57-74c5a1fa9774","Handler":null,"Name":"","Endpoint":""}
Apr 28 19:16:04.506422 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:04.506389 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 28 19:16:04.506422 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:04.506422 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 28 19:16:04.620562 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:04.620524 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" event={"ID":"16167a05-2728-4593-8476-13a52840c7fd","Type":"ContainerStarted","Data":"1da280eb541a142212cb02c62132953e001538bd4c3271ef04f17f3132b557fe"}
Apr 28 19:16:04.622032 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:04.622001 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gsq8s" event={"ID":"14dec99e-7998-4e82-bc5f-a1c596857848","Type":"ContainerStarted","Data":"d7a20c089e5dc5afd0b7911aca2fe0f3376399d6794e2b72fa9df3ec6130cadb"}
Apr 28 19:16:04.638987 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:04.638944 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4tvmc" podStartSLOduration=5.208431687 podStartE2EDuration="21.638927728s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.980083456 +0000 UTC m=+2.913753230" lastFinishedPulling="2026-04-28 19:16:02.410579493 +0000 UTC m=+19.344249271" observedRunningTime="2026-04-28 19:16:03.671353554 +0000 UTC m=+20.605023361" watchObservedRunningTime="2026-04-28 19:16:04.638927728 +0000 UTC m=+21.572597525"
Apr 28 19:16:04.639468 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:04.639433 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qr59r" podStartSLOduration=3.375600279 podStartE2EDuration="21.639424312s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.982707749 +0000 UTC m=+2.916377522" lastFinishedPulling="2026-04-28 19:16:04.246531767 +0000 UTC m=+21.180201555" observedRunningTime="2026-04-28 19:16:04.639062611 +0000 UTC m=+21.572732388" watchObservedRunningTime="2026-04-28 19:16:04.639424312 +0000 UTC m=+21.573094108"
Apr 28 19:16:04.655615 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:04.655576 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gsq8s" podStartSLOduration=5.219847287 podStartE2EDuration="21.655564175s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.974893254 +0000 UTC m=+2.908563033" lastFinishedPulling="2026-04-28 19:16:02.410610144 +0000 UTC m=+19.344279921" observedRunningTime="2026-04-28 19:16:04.655192904 +0000 UTC m=+21.588862698" watchObservedRunningTime="2026-04-28 19:16:04.655564175 +0000 UTC m=+21.589233972"
Apr 28 19:16:05.498867 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:05.498829 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq"
Apr 28 19:16:05.498867 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:05.498852 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs"
Apr 28 19:16:05.499189 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:05.498974 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74"
Apr 28 19:16:05.499189 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:05.499083 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf"
Apr 28 19:16:05.627576 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:05.627530 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" event={"ID":"f0ca223d-22df-4d91-a877-7adbc2efde17","Type":"ContainerStarted","Data":"7fcc78d31b4ec13cc565743f0e4bc4bd5c1177a6da10cfd52c5fece3e57449b3"}
Apr 28 19:16:06.039636 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:06.039602 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4tvmc"
Apr 28 19:16:06.040322 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:06.040179 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4tvmc"
Apr 28 19:16:06.629639 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:06.629579 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4tvmc"
Apr 28 19:16:06.630241 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:06.630173 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4tvmc"
Apr 28 19:16:07.181028 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.180991 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9vshb"]
Apr 28 19:16:07.186840 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.186820 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9vshb"
Apr 28 19:16:07.189442 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.189424 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 28 19:16:07.189873 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.189856 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 28 19:16:07.190688 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.190669 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4z7ww\""
Apr 28 19:16:07.313258 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.313227 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a521c111-aa4b-4eda-b86b-7b9f76fcd75f-hosts-file\") pod \"node-resolver-9vshb\" (UID: \"a521c111-aa4b-4eda-b86b-7b9f76fcd75f\") " pod="openshift-dns/node-resolver-9vshb"
Apr 28 19:16:07.313258 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.313264 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs949\" (UniqueName: \"kubernetes.io/projected/a521c111-aa4b-4eda-b86b-7b9f76fcd75f-kube-api-access-bs949\") pod \"node-resolver-9vshb\" (UID: \"a521c111-aa4b-4eda-b86b-7b9f76fcd75f\") " pod="openshift-dns/node-resolver-9vshb"
Apr 28 19:16:07.313468 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.313285 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a521c111-aa4b-4eda-b86b-7b9f76fcd75f-tmp-dir\") pod \"node-resolver-9vshb\" (UID: \"a521c111-aa4b-4eda-b86b-7b9f76fcd75f\") " pod="openshift-dns/node-resolver-9vshb"
Apr 28 19:16:07.413735 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.413661 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bs949\" (UniqueName: \"kubernetes.io/projected/a521c111-aa4b-4eda-b86b-7b9f76fcd75f-kube-api-access-bs949\") pod \"node-resolver-9vshb\" (UID: \"a521c111-aa4b-4eda-b86b-7b9f76fcd75f\") " pod="openshift-dns/node-resolver-9vshb"
Apr 28 19:16:07.413735 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.413701 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a521c111-aa4b-4eda-b86b-7b9f76fcd75f-tmp-dir\") pod \"node-resolver-9vshb\" (UID: \"a521c111-aa4b-4eda-b86b-7b9f76fcd75f\") " pod="openshift-dns/node-resolver-9vshb"
Apr 28 19:16:07.413941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.413779 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a521c111-aa4b-4eda-b86b-7b9f76fcd75f-hosts-file\") pod \"node-resolver-9vshb\" (UID: \"a521c111-aa4b-4eda-b86b-7b9f76fcd75f\") " pod="openshift-dns/node-resolver-9vshb"
Apr 28 19:16:07.413941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.413852 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a521c111-aa4b-4eda-b86b-7b9f76fcd75f-hosts-file\") pod \"node-resolver-9vshb\" (UID: \"a521c111-aa4b-4eda-b86b-7b9f76fcd75f\") " pod="openshift-dns/node-resolver-9vshb"
Apr 28 19:16:07.414091 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.414074 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a521c111-aa4b-4eda-b86b-7b9f76fcd75f-tmp-dir\") pod \"node-resolver-9vshb\" (UID: \"a521c111-aa4b-4eda-b86b-7b9f76fcd75f\") " pod="openshift-dns/node-resolver-9vshb"
Apr 28 19:16:07.424521 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.424348 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs949\" (UniqueName: \"kubernetes.io/projected/a521c111-aa4b-4eda-b86b-7b9f76fcd75f-kube-api-access-bs949\") pod \"node-resolver-9vshb\" (UID: \"a521c111-aa4b-4eda-b86b-7b9f76fcd75f\") " pod="openshift-dns/node-resolver-9vshb"
Apr 28 19:16:07.496893 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.496860 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9vshb"
Apr 28 19:16:07.498620 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.498599 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq"
Apr 28 19:16:07.498733 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:07.498711 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74"
Apr 28 19:16:07.498799 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.498778 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs"
Apr 28 19:16:07.498991 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:07.498959 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf"
Apr 28 19:16:07.553163 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:07.553136 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda521c111_aa4b_4eda_b86b_7b9f76fcd75f.slice/crio-e361ef74a6ea5c76e65f6576d532269e8595a7727b3e95a8ce156a551b7313d8 WatchSource:0}: Error finding container e361ef74a6ea5c76e65f6576d532269e8595a7727b3e95a8ce156a551b7313d8: Status 404 returned error can't find the container with id e361ef74a6ea5c76e65f6576d532269e8595a7727b3e95a8ce156a551b7313d8
Apr 28 19:16:07.633107 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:07.633077 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9vshb" event={"ID":"a521c111-aa4b-4eda-b86b-7b9f76fcd75f","Type":"ContainerStarted","Data":"e361ef74a6ea5c76e65f6576d532269e8595a7727b3e95a8ce156a551b7313d8"}
Apr 28 19:16:08.636377 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:08.636344 2565 generic.go:358] "Generic (PLEG): container finished" podID="0c174017-a3dc-4241-8008-c41fd1ae8cec" containerID="18eb93f946d4b32495802a2bc132eb6e0a90251e9c260f85292af76b82b076db" exitCode=0
Apr 28 19:16:08.636838 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:08.636433 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xt62h" event={"ID":"0c174017-a3dc-4241-8008-c41fd1ae8cec","Type":"ContainerDied","Data":"18eb93f946d4b32495802a2bc132eb6e0a90251e9c260f85292af76b82b076db"}
Apr 28 19:16:08.641783 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:08.641590 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" event={"ID":"f0ca223d-22df-4d91-a877-7adbc2efde17","Type":"ContainerStarted","Data":"713205dcb111fb45a98ef8ab8597c6a11a795e444d45da0abf43bbb2eaa6dae8"}
Apr 28 19:16:08.641910 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:08.641890 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:16:08.642029 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:08.642015 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:16:08.642103 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:08.642036 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:16:08.643028 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:08.643006 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9vshb" event={"ID":"a521c111-aa4b-4eda-b86b-7b9f76fcd75f","Type":"ContainerStarted","Data":"cf789b56ac092ac8000610a62cb7cef2e034194ec63b4359e404ffa01e999119"}
Apr 28 19:16:08.663086 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:08.663065 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:16:08.663258 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:08.663241 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:16:08.680306 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:08.680271 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9vshb" podStartSLOduration=1.680258166 podStartE2EDuration="1.680258166s" podCreationTimestamp="2026-04-28 19:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:08.680080589 +0000 UTC m=+25.613750395" watchObservedRunningTime="2026-04-28 19:16:08.680258166 +0000 UTC m=+25.613927960"
Apr 28 19:16:08.712577 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:08.712521 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59" podStartSLOduration=8.219091261 podStartE2EDuration="24.712509746s" podCreationTimestamp="2026-04-28 19:15:44 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.975955829 +0000 UTC m=+2.909625609" lastFinishedPulling="2026-04-28 19:16:02.469374306 +0000 UTC m=+19.403044094" observedRunningTime="2026-04-28 19:16:08.710552356 +0000 UTC m=+25.644222173" watchObservedRunningTime="2026-04-28 19:16:08.712509746 +0000 UTC m=+25.646179540"
Apr 28 19:16:09.498658 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:09.498629 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq"
Apr 28 19:16:09.498883 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:09.498749 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74"
Apr 28 19:16:09.498883 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:09.498790 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs"
Apr 28 19:16:09.498883 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:09.498859 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf"
Apr 28 19:16:09.577370 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:09.577265 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zj9qs"]
Apr 28 19:16:09.579655 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:09.579633 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nfhpq"]
Apr 28 19:16:09.646625 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:09.646599 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xt62h" event={"ID":"0c174017-a3dc-4241-8008-c41fd1ae8cec","Type":"ContainerStarted","Data":"8cac7bc2a68a18d3c988c4cf47065758d90648a25ec658b58b47c7e18e37f96a"}
Apr 28 19:16:09.647098 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:09.646643 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs"
Apr 28 19:16:09.647098 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:09.646748 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf"
Apr 28 19:16:09.647098 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:09.647019 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq"
Apr 28 19:16:09.647251 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:09.647108 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74"
Apr 28 19:16:10.650079 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:10.650045 2565 generic.go:358] "Generic (PLEG): container finished" podID="0c174017-a3dc-4241-8008-c41fd1ae8cec" containerID="8cac7bc2a68a18d3c988c4cf47065758d90648a25ec658b58b47c7e18e37f96a" exitCode=0
Apr 28 19:16:10.650441 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:10.650126 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xt62h" event={"ID":"0c174017-a3dc-4241-8008-c41fd1ae8cec","Type":"ContainerDied","Data":"8cac7bc2a68a18d3c988c4cf47065758d90648a25ec658b58b47c7e18e37f96a"}
Apr 28 19:16:11.498705 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:11.498627 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq"
Apr 28 19:16:11.498852 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:11.498631 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs"
Apr 28 19:16:11.498852 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:11.498776 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74"
Apr 28 19:16:11.498852 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:11.498826 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf"
Apr 28 19:16:11.654355 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:11.654325 2565 generic.go:358] "Generic (PLEG): container finished" podID="0c174017-a3dc-4241-8008-c41fd1ae8cec" containerID="29bfecaa039f571bd1a2d21d6d5f3d94de29f6ebe12b9ffc346ed9ff88a34ec4" exitCode=0
Apr 28 19:16:11.654928 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:11.654386 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xt62h" event={"ID":"0c174017-a3dc-4241-8008-c41fd1ae8cec","Type":"ContainerDied","Data":"29bfecaa039f571bd1a2d21d6d5f3d94de29f6ebe12b9ffc346ed9ff88a34ec4"}
Apr 28 19:16:13.501380 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:13.500533 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq"
Apr 28 19:16:13.501380 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:13.500686 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfhpq" podUID="a559869f-dc8c-4397-aa54-b59c274faa74"
Apr 28 19:16:13.501380 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:13.501159 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs"
Apr 28 19:16:13.501380 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:13.501312 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zj9qs" podUID="d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf"
Apr 28 19:16:14.742678 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:14.742456 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pd44w"]
Apr 28 19:16:14.792725 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:14.792688 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pd44w"]
Apr 28 19:16:14.792901 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:14.792820 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:14.792970 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:14.792903 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pd44w" podUID="113829ad-8ae9-4887-9929-882aabb0a1cb"
Apr 28 19:16:14.877279 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:14.877244 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/113829ad-8ae9-4887-9929-882aabb0a1cb-dbus\") pod \"global-pull-secret-syncer-pd44w\" (UID: \"113829ad-8ae9-4887-9929-882aabb0a1cb\") " pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:14.877467 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:14.877304 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/113829ad-8ae9-4887-9929-882aabb0a1cb-original-pull-secret\") pod \"global-pull-secret-syncer-pd44w\" (UID: \"113829ad-8ae9-4887-9929-882aabb0a1cb\") " pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:14.877467 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:14.877411 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/113829ad-8ae9-4887-9929-882aabb0a1cb-kubelet-config\") pod \"global-pull-secret-syncer-pd44w\" (UID: \"113829ad-8ae9-4887-9929-882aabb0a1cb\") " pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:14.977928 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:14.977894 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/113829ad-8ae9-4887-9929-882aabb0a1cb-dbus\") pod \"global-pull-secret-syncer-pd44w\" (UID: \"113829ad-8ae9-4887-9929-882aabb0a1cb\") " pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:14.978088 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:14.977950 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/113829ad-8ae9-4887-9929-882aabb0a1cb-original-pull-secret\") pod \"global-pull-secret-syncer-pd44w\" (UID: \"113829ad-8ae9-4887-9929-882aabb0a1cb\") " pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:14.978088 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:14.978004 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/113829ad-8ae9-4887-9929-882aabb0a1cb-kubelet-config\") pod \"global-pull-secret-syncer-pd44w\" (UID: \"113829ad-8ae9-4887-9929-882aabb0a1cb\") " pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:14.978190 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:14.978113 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/113829ad-8ae9-4887-9929-882aabb0a1cb-kubelet-config\") pod \"global-pull-secret-syncer-pd44w\" (UID: \"113829ad-8ae9-4887-9929-882aabb0a1cb\") " pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:14.978267 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:14.978194 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:14.978320 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:14.978275 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/113829ad-8ae9-4887-9929-882aabb0a1cb-original-pull-secret podName:113829ad-8ae9-4887-9929-882aabb0a1cb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:15.478256186 +0000 UTC m=+32.411925981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/113829ad-8ae9-4887-9929-882aabb0a1cb-original-pull-secret") pod "global-pull-secret-syncer-pd44w" (UID: "113829ad-8ae9-4887-9929-882aabb0a1cb") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:14.978320 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:14.978273 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/113829ad-8ae9-4887-9929-882aabb0a1cb-dbus\") pod \"global-pull-secret-syncer-pd44w\" (UID: \"113829ad-8ae9-4887-9929-882aabb0a1cb\") " pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:15.376026 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.375988 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-121.ec2.internal" event="NodeReady"
Apr 28 19:16:15.376248 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.376157 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 28 19:16:15.425728 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.425697 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6598fb7d78-gsbrc"]
Apr 28 19:16:15.443717 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.443686 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"]
Apr 28 19:16:15.443883 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.443831 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.449712 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.449518 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 28 19:16:15.450024 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.449997 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 28 19:16:15.451260 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.451234 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b9m64\""
Apr 28 19:16:15.451509 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.451490 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 28 19:16:15.461784 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.461760 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7r6rp"]
Apr 28 19:16:15.461923 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.461904 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:15.465745 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.465719 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 28 19:16:15.466184 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.466163 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 28 19:16:15.469272 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.469249 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 28 19:16:15.469390 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.469355 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 28 19:16:15.469556 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.469394 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-zgpqd\""
Apr 28 19:16:15.469556 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.469467 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 28 19:16:15.477256 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.477233 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-fl75w"]
Apr 28 19:16:15.477871 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.477483 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7r6rp"
Apr 28 19:16:15.482860 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.482836 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/113829ad-8ae9-4887-9929-882aabb0a1cb-original-pull-secret\") pod \"global-pull-secret-syncer-pd44w\" (UID: \"113829ad-8ae9-4887-9929-882aabb0a1cb\") " pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:15.482990 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.482960 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:15.483058 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.483045 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/113829ad-8ae9-4887-9929-882aabb0a1cb-original-pull-secret podName:113829ad-8ae9-4887-9929-882aabb0a1cb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.483026164 +0000 UTC m=+33.416695939 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/113829ad-8ae9-4887-9929-882aabb0a1cb-original-pull-secret") pod "global-pull-secret-syncer-pd44w" (UID: "113829ad-8ae9-4887-9929-882aabb0a1cb") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:15.484938 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.484913 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-qrb2k\"" Apr 28 19:16:15.485048 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.485020 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:15.485615 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.485267 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:15.501387 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.501360 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-p6wm2"] Apr 28 19:16:15.501531 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.501508 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-fl75w" Apr 28 19:16:15.501648 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.501511 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:16:15.501705 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.501664 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:16:15.506577 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.506558 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 28 19:16:15.506681 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.506604 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 28 19:16:15.506681 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.506672 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 28 19:16:15.507037 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.507018 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 28 19:16:15.507136 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.507040 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 28 19:16:15.507136 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.507097 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 28 19:16:15.507426 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.507409 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6mdm8\"" Apr 28 19:16:15.507757 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.507559 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-n7p66\"" Apr 28 19:16:15.507757 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.507622 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 28 19:16:15.507757 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.507727 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-vvntf\"" Apr 28 19:16:15.517295 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.517273 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" Apr 28 19:16:15.521895 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.521871 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"] Apr 28 19:16:15.522216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.522190 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:15.522321 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.522276 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:15.522321 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.522315 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 28 19:16:15.522918 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.522899 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 28 19:16:15.523029 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.522899 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9nbhb\"" Apr 28 19:16:15.524121 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.524100 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 28 19:16:15.527821 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.527800 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 28 19:16:15.549972 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.549950 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf"] Apr 28 19:16:15.550110 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.550092 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp" Apr 28 19:16:15.553898 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.553871 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-g5crm\"" Apr 28 19:16:15.554005 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.553908 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 28 19:16:15.554223 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.554205 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:15.554347 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.554325 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:15.573550 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.573526 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7r6rp"] Apr 28 19:16:15.573662 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.573576 2565 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-image-registry/image-registry-6598fb7d78-gsbrc"] Apr 28 19:16:15.573662 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.573582 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf" Apr 28 19:16:15.573662 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.573592 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf"] Apr 28 19:16:15.573662 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.573604 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"] Apr 28 19:16:15.573662 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.573616 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-p6wm2"] Apr 28 19:16:15.573662 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.573629 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xzw6d"] Apr 28 19:16:15.578161 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.578141 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 28 19:16:15.578255 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.578170 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:15.578255 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.578197 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9hskh\"" Apr 28 19:16:15.578586 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.578571 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:15.578844 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.578826 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 28 19:16:15.583879 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.583857 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd439354-dd75-46f4-9b42-13a91c27d851-serving-cert\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w" Apr 28 19:16:15.584010 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.583898 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbs9j\" (UniqueName: \"kubernetes.io/projected/29ca57b8-ae4f-4261-a245-529c6cfa8449-kube-api-access-lbs9j\") pod \"volume-data-source-validator-7c6cbb6c87-7r6rp\" (UID: \"29ca57b8-ae4f-4261-a245-529c6cfa8449\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7r6rp" Apr 28 19:16:15.584010 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.583925 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd" Apr 28 19:16:15.584010 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.583950 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:16:15.584010 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.583990 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-installation-pull-secrets\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:16:15.584010 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584008 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-bound-sa-token\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:16:15.584265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584023 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bd439354-dd75-46f4-9b42-13a91c27d851-snapshots\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w" Apr 28 19:16:15.584265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584041 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nknw\" (UniqueName: \"kubernetes.io/projected/bd439354-dd75-46f4-9b42-13a91c27d851-kube-api-access-7nknw\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " 
pod="openshift-insights/insights-operator-585dfdc468-fl75w" Apr 28 19:16:15.584265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584064 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwbtb\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-kube-api-access-pwbtb\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:16:15.584265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584088 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-image-registry-private-configuration\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:16:15.584265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584173 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd439354-dd75-46f4-9b42-13a91c27d851-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w" Apr 28 19:16:15.584265 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584210 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd439354-dd75-46f4-9b42-13a91c27d851-service-ca-bundle\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w" Apr 28 19:16:15.584265 ip-10-0-133-121 kubenswrapper[2565]: I0428 
19:16:15.584257 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpk2m\" (UniqueName: \"kubernetes.io/projected/a4826c50-2384-48cf-853b-ab348926b6e5-kube-api-access-gpk2m\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd" Apr 28 19:16:15.584553 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584294 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a4826c50-2384-48cf-853b-ab348926b6e5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd" Apr 28 19:16:15.584553 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584320 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de9861e6-21f7-4675-9370-4fa19956dd76-ca-trust-extracted\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:16:15.584553 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584347 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-registry-certificates\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:16:15.584553 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584373 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-trusted-ca\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:16:15.584553 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.584395 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd439354-dd75-46f4-9b42-13a91c27d851-tmp\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w" Apr 28 19:16:15.589060 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.589039 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"] Apr 28 19:16:15.589150 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.589071 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"] Apr 28 19:16:15.589201 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.589155 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xzw6d" Apr 28 19:16:15.591692 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.591674 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 28 19:16:15.591927 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.591909 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 28 19:16:15.592031 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.591973 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xdh6h\"" Apr 28 19:16:15.603251 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.603233 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-fl75w"] Apr 28 19:16:15.603343 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.603260 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t"] Apr 28 19:16:15.603396 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.603378 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7" Apr 28 19:16:15.608557 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.608533 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 28 19:16:15.608683 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.608661 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 28 19:16:15.609277 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.609260 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-t8rzl\"" Apr 28 19:16:15.614713 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.614675 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9kcz9"] Apr 28 19:16:15.614838 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.614823 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" Apr 28 19:16:15.617343 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.617326 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-r79h5\"" Apr 28 19:16:15.617966 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.617946 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:15.618081 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.617946 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 28 19:16:15.618140 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.617947 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 28 19:16:15.618376 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.618358 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:15.626815 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.626763 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-d5c596596-vxxm2"] Apr 28 19:16:15.626934 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.626916 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9kcz9" Apr 28 19:16:15.629664 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.629649 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-45ckq\"" Apr 28 19:16:15.638736 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.638719 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xzw6d"] Apr 28 19:16:15.638831 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.638740 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"] Apr 28 19:16:15.638831 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.638755 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t24nc"] Apr 28 19:16:15.638933 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.638868 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:15.642718 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.642699 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 28 19:16:15.642718 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.642712 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 28 19:16:15.642865 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.642799 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 28 19:16:15.643067 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.643047 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 28 19:16:15.643150 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.643109 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 28 19:16:15.643205 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.643181 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 28 19:16:15.643269 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.643252 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-d8gn2\"" Apr 28 19:16:15.651099 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.651079 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t"] Apr 28 19:16:15.651167 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.651106 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d5c596596-vxxm2"] Apr 28 
19:16:15.651167 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.651119 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9kcz9"]
Apr 28 19:16:15.651167 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.651131 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t24nc"]
Apr 28 19:16:15.651275 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.651186 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t24nc"
Apr 28 19:16:15.656304 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.656287 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 28 19:16:15.656497 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.656306 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 28 19:16:15.656617 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.656356 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 28 19:16:15.656685 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.656363 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qbgzj\""
Apr 28 19:16:15.661757 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.661741 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:15.664542 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.664523 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 28 19:16:15.684866 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.684844 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd439354-dd75-46f4-9b42-13a91c27d851-serving-cert\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.684991 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.684885 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bd439354-dd75-46f4-9b42-13a91c27d851-snapshots\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.684991 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.684909 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"
Apr 28 19:16:15.684991 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.684937 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-bound-sa-token\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.684991 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.684961 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nknw\" (UniqueName: \"kubernetes.io/projected/bd439354-dd75-46f4-9b42-13a91c27d851-kube-api-access-7nknw\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.685216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685023 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f81f5464-aa16-489d-80bf-9e5bf953f7af-serving-cert\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:15.685216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685057 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-image-registry-private-configuration\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.685216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685099 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd439354-dd75-46f4-9b42-13a91c27d851-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.685216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685126 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"
Apr 28 19:16:15.685216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685155 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mffsm\" (UniqueName: \"kubernetes.io/projected/f81f5464-aa16-489d-80bf-9e5bf953f7af-kube-api-access-mffsm\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:15.685438 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685273 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlq2r\" (UniqueName: \"kubernetes.io/projected/f2b08f0e-bfe6-4201-9b2f-80bb5473ba65-kube-api-access-nlq2r\") pod \"service-ca-operator-d6fc45fc5-v5vpf\" (UID: \"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf"
Apr 28 19:16:15.685438 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685341 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2b08f0e-bfe6-4201-9b2f-80bb5473ba65-serving-cert\") pod \"service-ca-operator-d6fc45fc5-v5vpf\" (UID: \"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf"
Apr 28 19:16:15.685438 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685406 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a4826c50-2384-48cf-853b-ab348926b6e5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:15.685438 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685436 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de9861e6-21f7-4675-9370-4fa19956dd76-ca-trust-extracted\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.685630 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685488 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85a1e12e-6bf9-4548-83dd-a69765a8c24d-config-volume\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:15.685630 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685544 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81f5464-aa16-489d-80bf-9e5bf953f7af-config\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:15.685630 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685578 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-installation-pull-secrets\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.685827 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685758 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de9861e6-21f7-4675-9370-4fa19956dd76-ca-trust-extracted\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.685827 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685818 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b08f0e-bfe6-4201-9b2f-80bb5473ba65-config\") pod \"service-ca-operator-d6fc45fc5-v5vpf\" (UID: \"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf"
Apr 28 19:16:15.685910 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685837 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85a1e12e-6bf9-4548-83dd-a69765a8c24d-tmp-dir\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:15.685910 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685858 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbs9j\" (UniqueName: \"kubernetes.io/projected/29ca57b8-ae4f-4261-a245-529c6cfa8449-kube-api-access-lbs9j\") pod \"volume-data-source-validator-7c6cbb6c87-7r6rp\" (UID: \"29ca57b8-ae4f-4261-a245-529c6cfa8449\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7r6rp"
Apr 28 19:16:15.685910 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685880 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbtb\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-kube-api-access-pwbtb\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.685910 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685906 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:15.686089 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685931 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.686089 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685949 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f81f5464-aa16-489d-80bf-9e5bf953f7af-trusted-ca\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:15.686089 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.685967 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:15.686089 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.686022 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2cbf\" (UniqueName: \"kubernetes.io/projected/85a1e12e-6bf9-4548-83dd-a69765a8c24d-kube-api-access-s2cbf\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:15.686089 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.686057 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/42b7e7a8-ff86-48ac-bd59-aab3db697272-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"
Apr 28 19:16:15.686089 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.686084 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd439354-dd75-46f4-9b42-13a91c27d851-service-ca-bundle\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.686330 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.686096 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd439354-dd75-46f4-9b42-13a91c27d851-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.686330 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.686110 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpk2m\" (UniqueName: \"kubernetes.io/projected/a4826c50-2384-48cf-853b-ab348926b6e5-kube-api-access-gpk2m\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:15.686330 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.686127 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5g7d\" (UniqueName: \"kubernetes.io/projected/2e2cbc2f-671d-4390-96cf-52b82b4e889a-kube-api-access-w5g7d\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"
Apr 28 19:16:15.686330 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.686203 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 28 19:16:15.686330 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.686273 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:15.686330 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.686284 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6598fb7d78-gsbrc: secret "image-registry-tls" not found
Apr 28 19:16:15.686578 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.686340 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls podName:de9861e6-21f7-4675-9370-4fa19956dd76 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.186323213 +0000 UTC m=+33.119992990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls") pod "image-registry-6598fb7d78-gsbrc" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76") : secret "image-registry-tls" not found
Apr 28 19:16:15.686578 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.686446 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls podName:a4826c50-2384-48cf-853b-ab348926b6e5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.186427633 +0000 UTC m=+33.120097425 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ml5vd" (UID: "a4826c50-2384-48cf-853b-ab348926b6e5") : secret "cluster-monitoring-operator-tls" not found
Apr 28 19:16:15.686578 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.686475 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-registry-certificates\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.686578 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.686496 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-trusted-ca\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.686578 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.686559 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd439354-dd75-46f4-9b42-13a91c27d851-tmp\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.686798 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.686776 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd439354-dd75-46f4-9b42-13a91c27d851-service-ca-bundle\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.687293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.687271 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-trusted-ca\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.690027 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.689969 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd439354-dd75-46f4-9b42-13a91c27d851-serving-cert\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.690027 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.690016 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-image-registry-private-configuration\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.690165 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.690024 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-installation-pull-secrets\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.695731 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.695667 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bd439354-dd75-46f4-9b42-13a91c27d851-snapshots\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.695959 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.695852 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd439354-dd75-46f4-9b42-13a91c27d851-tmp\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.695959 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.695902 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a4826c50-2384-48cf-853b-ab348926b6e5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:15.698755 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.698732 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-registry-certificates\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.703079 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.703052 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbs9j\" (UniqueName: \"kubernetes.io/projected/29ca57b8-ae4f-4261-a245-529c6cfa8449-kube-api-access-lbs9j\") pod \"volume-data-source-validator-7c6cbb6c87-7r6rp\" (UID: \"29ca57b8-ae4f-4261-a245-529c6cfa8449\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7r6rp"
Apr 28 19:16:15.704041 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.704023 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nknw\" (UniqueName: \"kubernetes.io/projected/bd439354-dd75-46f4-9b42-13a91c27d851-kube-api-access-7nknw\") pod \"insights-operator-585dfdc468-fl75w\" (UID: \"bd439354-dd75-46f4-9b42-13a91c27d851\") " pod="openshift-insights/insights-operator-585dfdc468-fl75w"
Apr 28 19:16:15.704041 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.704036 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-bound-sa-token\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.705338 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.705318 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwbtb\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-kube-api-access-pwbtb\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:15.705522 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.705497 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpk2m\" (UniqueName: \"kubernetes.io/projected/a4826c50-2384-48cf-853b-ab348926b6e5-kube-api-access-gpk2m\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:15.787876 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.787843 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xk2n\" (UniqueName: \"kubernetes.io/projected/ae143e08-b979-475a-abb9-654fdf653811-kube-api-access-9xk2n\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.787903 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f81f5464-aa16-489d-80bf-9e5bf953f7af-trusted-ca\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.787991 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788034 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788068 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " pod="openshift-ingress-canary/ingress-canary-t24nc"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.788073 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788107 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2cbf\" (UniqueName: \"kubernetes.io/projected/85a1e12e-6bf9-4548-83dd-a69765a8c24d-kube-api-access-s2cbf\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.788133 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls podName:85a1e12e-6bf9-4548-83dd-a69765a8c24d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.288114392 +0000 UTC m=+33.221784168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls") pod "dns-default-xzw6d" (UID: "85a1e12e-6bf9-4548-83dd-a69765a8c24d") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788168 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/42b7e7a8-ff86-48ac-bd59-aab3db697272-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788217 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5g7d\" (UniqueName: \"kubernetes.io/projected/2e2cbc2f-671d-4390-96cf-52b82b4e889a-kube-api-access-w5g7d\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788248 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr5vz\" (UniqueName: \"kubernetes.io/projected/d08c9e66-fe95-42f4-be54-32ce7e41a44e-kube-api-access-fr5vz\") pod \"kube-storage-version-migrator-operator-6769c5d45-r5k9t\" (UID: \"d08c9e66-fe95-42f4-be54-32ce7e41a44e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788278 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788308 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08c9e66-fe95-42f4-be54-32ce7e41a44e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-r5k9t\" (UID: \"d08c9e66-fe95-42f4-be54-32ce7e41a44e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788333 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7r6rp"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788350 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"
Apr 28 19:16:15.788381 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788378 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-default-certificate\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788413 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f81f5464-aa16-489d-80bf-9e5bf953f7af-serving-cert\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788444 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-stats-auth\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788493 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d08c9e66-fe95-42f4-be54-32ce7e41a44e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-r5k9t\" (UID: \"d08c9e66-fe95-42f4-be54-32ce7e41a44e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788522 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788551 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mffsm\" (UniqueName: \"kubernetes.io/projected/f81f5464-aa16-489d-80bf-9e5bf953f7af-kube-api-access-mffsm\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788582 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlq2r\" (UniqueName: \"kubernetes.io/projected/f2b08f0e-bfe6-4201-9b2f-80bb5473ba65-kube-api-access-nlq2r\") pod \"service-ca-operator-d6fc45fc5-v5vpf\" (UID: \"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788614 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2b08f0e-bfe6-4201-9b2f-80bb5473ba65-serving-cert\") pod \"service-ca-operator-d6fc45fc5-v5vpf\" (UID: \"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788642 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zzdm\" (UniqueName: \"kubernetes.io/projected/97ec2c5a-89fa-4b8f-9919-ad126433ee06-kube-api-access-2zzdm\") pod \"network-check-source-8894fc9bd-9kcz9\" (UID: \"97ec2c5a-89fa-4b8f-9919-ad126433ee06\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9kcz9"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788728 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85a1e12e-6bf9-4548-83dd-a69765a8c24d-config-volume\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788738 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f81f5464-aa16-489d-80bf-9e5bf953f7af-trusted-ca\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788769 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81f5464-aa16-489d-80bf-9e5bf953f7af-config\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788802 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tvjd\" (UniqueName: \"kubernetes.io/projected/cd28326f-9576-4715-a0b0-f6812113130a-kube-api-access-8tvjd\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " pod="openshift-ingress-canary/ingress-canary-t24nc"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788836 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b08f0e-bfe6-4201-9b2f-80bb5473ba65-config\") pod \"service-ca-operator-d6fc45fc5-v5vpf\" (UID: \"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf"
Apr 28 19:16:15.789171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.788865 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85a1e12e-6bf9-4548-83dd-a69765a8c24d-tmp-dir\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:15.789904 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.789400 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:16:15.789904 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.789409 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/42b7e7a8-ff86-48ac-bd59-aab3db697272-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"
Apr 28 19:16:15.789904 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.789459 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert podName:42b7e7a8-ff86-48ac-bd59-aab3db697272 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.289444094 +0000 UTC m=+33.223113870 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l29s7" (UID: "42b7e7a8-ff86-48ac-bd59-aab3db697272") : secret "networking-console-plugin-cert" not found
Apr 28 19:16:15.789904 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.789485 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 28 19:16:15.789904 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.789519 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls podName:2e2cbc2f-671d-4390-96cf-52b82b4e889a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.289508587 +0000 UTC m=+33.223178372 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-slrjp" (UID: "2e2cbc2f-671d-4390-96cf-52b82b4e889a") : secret "samples-operator-tls" not found Apr 28 19:16:15.789904 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.789772 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85a1e12e-6bf9-4548-83dd-a69765a8c24d-tmp-dir\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d" Apr 28 19:16:15.789904 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.789829 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81f5464-aa16-489d-80bf-9e5bf953f7af-config\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" Apr 28 19:16:15.790262 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.789907 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85a1e12e-6bf9-4548-83dd-a69765a8c24d-config-volume\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d" Apr 28 19:16:15.790262 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.789994 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b08f0e-bfe6-4201-9b2f-80bb5473ba65-config\") pod \"service-ca-operator-d6fc45fc5-v5vpf\" (UID: \"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf" Apr 28 19:16:15.791717 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.791686 2565 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f81f5464-aa16-489d-80bf-9e5bf953f7af-serving-cert\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" Apr 28 19:16:15.800294 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.800269 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2b08f0e-bfe6-4201-9b2f-80bb5473ba65-serving-cert\") pod \"service-ca-operator-d6fc45fc5-v5vpf\" (UID: \"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf" Apr 28 19:16:15.808459 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.808438 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2cbf\" (UniqueName: \"kubernetes.io/projected/85a1e12e-6bf9-4548-83dd-a69765a8c24d-kube-api-access-s2cbf\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d" Apr 28 19:16:15.814455 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.814433 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlq2r\" (UniqueName: \"kubernetes.io/projected/f2b08f0e-bfe6-4201-9b2f-80bb5473ba65-kube-api-access-nlq2r\") pod \"service-ca-operator-d6fc45fc5-v5vpf\" (UID: \"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf" Apr 28 19:16:15.815217 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.815191 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5g7d\" (UniqueName: \"kubernetes.io/projected/2e2cbc2f-671d-4390-96cf-52b82b4e889a-kube-api-access-w5g7d\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp" Apr 28 19:16:15.815287 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.815246 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mffsm\" (UniqueName: \"kubernetes.io/projected/f81f5464-aa16-489d-80bf-9e5bf953f7af-kube-api-access-mffsm\") pod \"console-operator-9d4b6777b-p6wm2\" (UID: \"f81f5464-aa16-489d-80bf-9e5bf953f7af\") " pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" Apr 28 19:16:15.818915 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.818895 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-fl75w" Apr 28 19:16:15.844135 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.844118 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" Apr 28 19:16:15.884049 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.883964 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf" Apr 28 19:16:15.889890 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.889866 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr5vz\" (UniqueName: \"kubernetes.io/projected/d08c9e66-fe95-42f4-be54-32ce7e41a44e-kube-api-access-fr5vz\") pod \"kube-storage-version-migrator-operator-6769c5d45-r5k9t\" (UID: \"d08c9e66-fe95-42f4-be54-32ce7e41a44e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" Apr 28 19:16:15.890014 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.889899 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:15.890014 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.889921 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08c9e66-fe95-42f4-be54-32ce7e41a44e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-r5k9t\" (UID: \"d08c9e66-fe95-42f4-be54-32ce7e41a44e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" Apr 28 19:16:15.890014 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.889964 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-default-certificate\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:15.890014 ip-10-0-133-121 kubenswrapper[2565]: 
I0428 19:16:15.890011 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-stats-auth\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:15.890171 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.890044 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:16:15.890171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.890049 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d08c9e66-fe95-42f4-be54-32ce7e41a44e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-r5k9t\" (UID: \"d08c9e66-fe95-42f4-be54-32ce7e41a44e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" Apr 28 19:16:15.890171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.890090 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zzdm\" (UniqueName: \"kubernetes.io/projected/97ec2c5a-89fa-4b8f-9919-ad126433ee06-kube-api-access-2zzdm\") pod \"network-check-source-8894fc9bd-9kcz9\" (UID: \"97ec2c5a-89fa-4b8f-9919-ad126433ee06\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9kcz9" Apr 28 19:16:15.890171 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.890155 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tvjd\" (UniqueName: \"kubernetes.io/projected/cd28326f-9576-4715-a0b0-f6812113130a-kube-api-access-8tvjd\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " pod="openshift-ingress-canary/ingress-canary-t24nc" Apr 28 19:16:15.890324 ip-10-0-133-121 kubenswrapper[2565]: 
I0428 19:16:15.890197 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xk2n\" (UniqueName: \"kubernetes.io/projected/ae143e08-b979-475a-abb9-654fdf653811-kube-api-access-9xk2n\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:15.890324 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.890256 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:15.890324 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.890288 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " pod="openshift-ingress-canary/ingress-canary-t24nc" Apr 28 19:16:15.890429 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.890382 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:16:15.890471 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.890437 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert podName:cd28326f-9576-4715-a0b0-f6812113130a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.390419688 +0000 UTC m=+33.324089482 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert") pod "ingress-canary-t24nc" (UID: "cd28326f-9576-4715-a0b0-f6812113130a") : secret "canary-serving-cert" not found Apr 28 19:16:15.890532 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.890499 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08c9e66-fe95-42f4-be54-32ce7e41a44e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-r5k9t\" (UID: \"d08c9e66-fe95-42f4-be54-32ce7e41a44e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" Apr 28 19:16:15.890579 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.890556 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.390537524 +0000 UTC m=+33.324207308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : secret "router-metrics-certs-default" not found Apr 28 19:16:15.890694 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:15.890673 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.390656712 +0000 UTC m=+33.324326486 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : configmap references non-existent config key: service-ca.crt Apr 28 19:16:15.892766 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.892740 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d08c9e66-fe95-42f4-be54-32ce7e41a44e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-r5k9t\" (UID: \"d08c9e66-fe95-42f4-be54-32ce7e41a44e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" Apr 28 19:16:15.892863 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.892844 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-default-certificate\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:15.892944 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.892925 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-stats-auth\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:15.906852 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.906831 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tvjd\" (UniqueName: \"kubernetes.io/projected/cd28326f-9576-4715-a0b0-f6812113130a-kube-api-access-8tvjd\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " 
pod="openshift-ingress-canary/ingress-canary-t24nc" Apr 28 19:16:15.909407 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.909384 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zzdm\" (UniqueName: \"kubernetes.io/projected/97ec2c5a-89fa-4b8f-9919-ad126433ee06-kube-api-access-2zzdm\") pod \"network-check-source-8894fc9bd-9kcz9\" (UID: \"97ec2c5a-89fa-4b8f-9919-ad126433ee06\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9kcz9" Apr 28 19:16:15.911016 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.910997 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xk2n\" (UniqueName: \"kubernetes.io/projected/ae143e08-b979-475a-abb9-654fdf653811-kube-api-access-9xk2n\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:15.911129 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.911110 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr5vz\" (UniqueName: \"kubernetes.io/projected/d08c9e66-fe95-42f4-be54-32ce7e41a44e-kube-api-access-fr5vz\") pod \"kube-storage-version-migrator-operator-6769c5d45-r5k9t\" (UID: \"d08c9e66-fe95-42f4-be54-32ce7e41a44e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" Apr 28 19:16:15.927857 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.927830 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" Apr 28 19:16:15.937054 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:15.937031 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9kcz9" Apr 28 19:16:16.192775 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:16.192680 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd" Apr 28 19:16:16.192775 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:16.192727 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:16:16.192989 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.192843 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:16:16.192989 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.192926 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls podName:a4826c50-2384-48cf-853b-ab348926b6e5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.192905329 +0000 UTC m=+34.126575113 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ml5vd" (UID: "a4826c50-2384-48cf-853b-ab348926b6e5") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:16:16.192989 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.192932 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:16:16.192989 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.192955 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6598fb7d78-gsbrc: secret "image-registry-tls" not found Apr 28 19:16:16.193176 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.193037 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls podName:de9861e6-21f7-4675-9370-4fa19956dd76 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.193019585 +0000 UTC m=+34.126689366 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls") pod "image-registry-6598fb7d78-gsbrc" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76") : secret "image-registry-tls" not found Apr 28 19:16:16.293900 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:16.293858 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d" Apr 28 19:16:16.294097 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:16.294007 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp" Apr 28 19:16:16.294097 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.294036 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:16:16.294097 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:16.294063 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7" Apr 28 19:16:16.294247 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.294112 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls podName:85a1e12e-6bf9-4548-83dd-a69765a8c24d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.294093434 +0000 UTC m=+34.227763234 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls") pod "dns-default-xzw6d" (UID: "85a1e12e-6bf9-4548-83dd-a69765a8c24d") : secret "dns-default-metrics-tls" not found Apr 28 19:16:16.294247 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.294161 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:16:16.294247 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.294210 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:16:16.294247 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.294228 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls podName:2e2cbc2f-671d-4390-96cf-52b82b4e889a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.294210432 +0000 UTC m=+34.227880207 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-slrjp" (UID: "2e2cbc2f-671d-4390-96cf-52b82b4e889a") : secret "samples-operator-tls" not found Apr 28 19:16:16.294435 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.294260 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert podName:42b7e7a8-ff86-48ac-bd59-aab3db697272 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:17.294244449 +0000 UTC m=+34.227914229 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l29s7" (UID: "42b7e7a8-ff86-48ac-bd59-aab3db697272") : secret "networking-console-plugin-cert" not found Apr 28 19:16:16.394824 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:16.394789 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:16.394824 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:16.394831 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " pod="openshift-ingress-canary/ingress-canary-t24nc" Apr 28 19:16:16.395104 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:16.394891 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:16.395104 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.394958 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:17.394939369 +0000 UTC m=+34.328609146 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : configmap references non-existent config key: service-ca.crt
Apr 28 19:16:16.395104 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.395029 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:16.395104 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.395024 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 28 19:16:16.395104 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.395102 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert podName:cd28326f-9576-4715-a0b0-f6812113130a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.395087157 +0000 UTC m=+34.328756930 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert") pod "ingress-canary-t24nc" (UID: "cd28326f-9576-4715-a0b0-f6812113130a") : secret "canary-serving-cert" not found
Apr 28 19:16:16.395360 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:16.395122 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.395112868 +0000 UTC m=+34.328782644 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : secret "router-metrics-certs-default" not found
Apr 28 19:16:16.496125 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:16.496036 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/113829ad-8ae9-4887-9929-882aabb0a1cb-original-pull-secret\") pod \"global-pull-secret-syncer-pd44w\" (UID: \"113829ad-8ae9-4887-9929-882aabb0a1cb\") " pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:16.514841 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:16.514812 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/113829ad-8ae9-4887-9929-882aabb0a1cb-original-pull-secret\") pod \"global-pull-secret-syncer-pd44w\" (UID: \"113829ad-8ae9-4887-9929-882aabb0a1cb\") " pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:16.570707 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:16.570670 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pd44w"
Apr 28 19:16:17.203883 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.203843 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:17.203883 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.203884 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:17.204345 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.204026 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:17.204345 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.204039 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6598fb7d78-gsbrc: secret "image-registry-tls" not found
Apr 28 19:16:17.204345 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.204055 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq"
Apr 28 19:16:17.204345 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.204094 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls podName:de9861e6-21f7-4675-9370-4fa19956dd76 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.204074642 +0000 UTC m=+36.137744449 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls") pod "image-registry-6598fb7d78-gsbrc" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76") : secret "image-registry-tls" not found
Apr 28 19:16:17.204345 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.204027 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 28 19:16:17.204345 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.204125 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 28 19:16:17.204345 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.204182 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls podName:a4826c50-2384-48cf-853b-ab348926b6e5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.204160959 +0000 UTC m=+36.137830739 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ml5vd" (UID: "a4826c50-2384-48cf-853b-ab348926b6e5") : secret "cluster-monitoring-operator-tls" not found
Apr 28 19:16:17.204345 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.204226 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs podName:a559869f-dc8c-4397-aa54-b59c274faa74 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:49.204215417 +0000 UTC m=+66.137885193 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs") pod "network-metrics-daemon-nfhpq" (UID: "a559869f-dc8c-4397-aa54-b59c274faa74") : secret "metrics-daemon-secret" not found
Apr 28 19:16:17.304574 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.304540 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:17.304748 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.304660 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"
Apr 28 19:16:17.304748 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.304705 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:17.304748 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.304717 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"
Apr 28 19:16:17.304858 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.304770 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls podName:85a1e12e-6bf9-4548-83dd-a69765a8c24d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.304754002 +0000 UTC m=+36.238423776 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls") pod "dns-default-xzw6d" (UID: "85a1e12e-6bf9-4548-83dd-a69765a8c24d") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:17.304858 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.304814 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 28 19:16:17.304858 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.304824 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:16:17.305004 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.304878 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls podName:2e2cbc2f-671d-4390-96cf-52b82b4e889a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.30486238 +0000 UTC m=+36.238532157 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-slrjp" (UID: "2e2cbc2f-671d-4390-96cf-52b82b4e889a") : secret "samples-operator-tls" not found
Apr 28 19:16:17.305004 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.304897 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert podName:42b7e7a8-ff86-48ac-bd59-aab3db697272 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.304889724 +0000 UTC m=+36.238559501 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l29s7" (UID: "42b7e7a8-ff86-48ac-bd59-aab3db697272") : secret "networking-console-plugin-cert" not found
Apr 28 19:16:17.405677 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.405641 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:17.405677 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.405682 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " pod="openshift-ingress-canary/ingress-canary-t24nc"
Apr 28 19:16:17.405872 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.405711 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:17.405872 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.405766 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98gww\" (UniqueName: \"kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww\") pod \"network-check-target-zj9qs\" (UID: \"d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf\") " pod="openshift-network-diagnostics/network-check-target-zj9qs"
Apr 28 19:16:17.405872 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.405801 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:17.405872 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.405825 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.405805654 +0000 UTC m=+36.339475428 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : configmap references non-existent config key: service-ca.crt
Apr 28 19:16:17.405872 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.405845 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert podName:cd28326f-9576-4715-a0b0-f6812113130a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.405834856 +0000 UTC m=+36.339504635 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert") pod "ingress-canary-t24nc" (UID: "cd28326f-9576-4715-a0b0-f6812113130a") : secret "canary-serving-cert" not found
Apr 28 19:16:17.406067 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.405872 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 28 19:16:17.406067 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:17.405932 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.405917033 +0000 UTC m=+36.339586807 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : secret "router-metrics-certs-default" not found
Apr 28 19:16:17.408087 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.408065 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98gww\" (UniqueName: \"kubernetes.io/projected/d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf-kube-api-access-98gww\") pod \"network-check-target-zj9qs\" (UID: \"d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf\") " pod="openshift-network-diagnostics/network-check-target-zj9qs"
Apr 28 19:16:17.631719 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.631147 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zj9qs"
Apr 28 19:16:17.723964 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.723895 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t"]
Apr 28 19:16:17.728734 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.728693 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-fl75w"]
Apr 28 19:16:17.734629 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.734590 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9kcz9"]
Apr 28 19:16:17.753697 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.753674 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-p6wm2"]
Apr 28 19:16:17.762443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.762412 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pd44w"]
Apr 28 19:16:17.770822 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.770801 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7r6rp"]
Apr 28 19:16:17.772033 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.772013 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf"]
Apr 28 19:16:17.798325 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:17.798301 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zj9qs"]
Apr 28 19:16:17.823639 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:17.823613 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd08c9e66_fe95_42f4_be54_32ce7e41a44e.slice/crio-99f4c3f62f51598fae88af1b377411c8668e7c7ff8807d81c5dd740ea8bee2ec WatchSource:0}: Error finding container 99f4c3f62f51598fae88af1b377411c8668e7c7ff8807d81c5dd740ea8bee2ec: Status 404 returned error can't find the container with id 99f4c3f62f51598fae88af1b377411c8668e7c7ff8807d81c5dd740ea8bee2ec
Apr 28 19:16:17.824050 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:17.824027 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd439354_dd75_46f4_9b42_13a91c27d851.slice/crio-51040080e40bfeb044d315073a7508de306520e1cc2b4542aef17b194df23823 WatchSource:0}: Error finding container 51040080e40bfeb044d315073a7508de306520e1cc2b4542aef17b194df23823: Status 404 returned error can't find the container with id 51040080e40bfeb044d315073a7508de306520e1cc2b4542aef17b194df23823
Apr 28 19:16:17.825302 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:17.824836 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ec2c5a_89fa_4b8f_9919_ad126433ee06.slice/crio-f8bd88cc76600bf2e18cf44574b594418fb91aef4a8d543c969ba545840d3330 WatchSource:0}: Error finding container f8bd88cc76600bf2e18cf44574b594418fb91aef4a8d543c969ba545840d3330: Status 404 returned error can't find the container with id f8bd88cc76600bf2e18cf44574b594418fb91aef4a8d543c969ba545840d3330
Apr 28 19:16:17.826474 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:17.826445 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81f5464_aa16_489d_80bf_9e5bf953f7af.slice/crio-8965ac4181d51cffc7b3d87389b1e383e9dd27d8631b2dc790aeeeb61ed3da06 WatchSource:0}: Error finding container 8965ac4181d51cffc7b3d87389b1e383e9dd27d8631b2dc790aeeeb61ed3da06: Status 404 returned error can't find the container with id 8965ac4181d51cffc7b3d87389b1e383e9dd27d8631b2dc790aeeeb61ed3da06
Apr 28 19:16:17.827381 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:17.827360 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2b08f0e_bfe6_4201_9b2f_80bb5473ba65.slice/crio-6e4088d60f87a11156d8fc86b74d9dc25c8a2c14bfb44d6d14441498fe5eb136 WatchSource:0}: Error finding container 6e4088d60f87a11156d8fc86b74d9dc25c8a2c14bfb44d6d14441498fe5eb136: Status 404 returned error can't find the container with id 6e4088d60f87a11156d8fc86b74d9dc25c8a2c14bfb44d6d14441498fe5eb136
Apr 28 19:16:17.840423 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:17.840398 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113829ad_8ae9_4887_9929_882aabb0a1cb.slice/crio-13c0e6e0599d41cb2811f187ecdb7615ed6ae5803c10051156d6e8eafac50ff8 WatchSource:0}: Error finding container 13c0e6e0599d41cb2811f187ecdb7615ed6ae5803c10051156d6e8eafac50ff8: Status 404 returned error can't find the container with id 13c0e6e0599d41cb2811f187ecdb7615ed6ae5803c10051156d6e8eafac50ff8
Apr 28 19:16:17.841865 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:17.841431 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29ca57b8_ae4f_4261_a245_529c6cfa8449.slice/crio-3857a96cd07c76815ff419ae59aa1f9358f64d7df2b30083649bef3e6a5dfe55 WatchSource:0}: Error finding container 3857a96cd07c76815ff419ae59aa1f9358f64d7df2b30083649bef3e6a5dfe55: Status 404 returned error can't find the container with id 3857a96cd07c76815ff419ae59aa1f9358f64d7df2b30083649bef3e6a5dfe55
Apr 28 19:16:17.842412 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:17.842364 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3913cb5_cdc7_4e4c_9f54_04992f3a0bcf.slice/crio-c4cc8e119725381a6416a95f8e4484206554a88987fc797751a685a215a5b4c5 WatchSource:0}: Error finding container c4cc8e119725381a6416a95f8e4484206554a88987fc797751a685a215a5b4c5: Status 404 returned error can't find the container with id c4cc8e119725381a6416a95f8e4484206554a88987fc797751a685a215a5b4c5
Apr 28 19:16:18.679129 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:18.679067 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf" event={"ID":"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65","Type":"ContainerStarted","Data":"6e4088d60f87a11156d8fc86b74d9dc25c8a2c14bfb44d6d14441498fe5eb136"}
Apr 28 19:16:18.688300 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:18.687207 2565 generic.go:358] "Generic (PLEG): container finished" podID="0c174017-a3dc-4241-8008-c41fd1ae8cec" containerID="338435a6ec5e330f3ed5d272ef7ce03dc6b14b23a1d4934347dabd349b313fcf" exitCode=0
Apr 28 19:16:18.688300 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:18.687292 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xt62h" event={"ID":"0c174017-a3dc-4241-8008-c41fd1ae8cec","Type":"ContainerDied","Data":"338435a6ec5e330f3ed5d272ef7ce03dc6b14b23a1d4934347dabd349b313fcf"}
Apr 28 19:16:18.700862 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:18.700816 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zj9qs" event={"ID":"d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf","Type":"ContainerStarted","Data":"c4cc8e119725381a6416a95f8e4484206554a88987fc797751a685a215a5b4c5"}
Apr 28 19:16:18.705551 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:18.705524 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" event={"ID":"d08c9e66-fe95-42f4-be54-32ce7e41a44e","Type":"ContainerStarted","Data":"99f4c3f62f51598fae88af1b377411c8668e7c7ff8807d81c5dd740ea8bee2ec"}
Apr 28 19:16:18.710008 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:18.709957 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-fl75w" event={"ID":"bd439354-dd75-46f4-9b42-13a91c27d851","Type":"ContainerStarted","Data":"51040080e40bfeb044d315073a7508de306520e1cc2b4542aef17b194df23823"}
Apr 28 19:16:18.717828 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:18.717780 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pd44w" event={"ID":"113829ad-8ae9-4887-9929-882aabb0a1cb","Type":"ContainerStarted","Data":"13c0e6e0599d41cb2811f187ecdb7615ed6ae5803c10051156d6e8eafac50ff8"}
Apr 28 19:16:18.722624 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:18.721531 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" event={"ID":"f81f5464-aa16-489d-80bf-9e5bf953f7af","Type":"ContainerStarted","Data":"8965ac4181d51cffc7b3d87389b1e383e9dd27d8631b2dc790aeeeb61ed3da06"}
Apr 28 19:16:18.722734 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:18.722697 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9kcz9" event={"ID":"97ec2c5a-89fa-4b8f-9919-ad126433ee06","Type":"ContainerStarted","Data":"f8bd88cc76600bf2e18cf44574b594418fb91aef4a8d543c969ba545840d3330"}
Apr 28 19:16:18.726606 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:18.726580 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7r6rp" event={"ID":"29ca57b8-ae4f-4261-a245-529c6cfa8449","Type":"ContainerStarted","Data":"3857a96cd07c76815ff419ae59aa1f9358f64d7df2b30083649bef3e6a5dfe55"}
Apr 28 19:16:19.228323 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:19.227865 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:19.228323 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:19.227914 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:19.228323 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.228091 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 28 19:16:19.228323 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.228115 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:19.228323 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.228127 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6598fb7d78-gsbrc: secret "image-registry-tls" not found
Apr 28 19:16:19.228323 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.228161 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls podName:a4826c50-2384-48cf-853b-ab348926b6e5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.228143034 +0000 UTC m=+40.161812826 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ml5vd" (UID: "a4826c50-2384-48cf-853b-ab348926b6e5") : secret "cluster-monitoring-operator-tls" not found
Apr 28 19:16:19.228323 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.228181 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls podName:de9861e6-21f7-4675-9370-4fa19956dd76 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.228171681 +0000 UTC m=+40.161841454 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls") pod "image-registry-6598fb7d78-gsbrc" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76") : secret "image-registry-tls" not found
Apr 28 19:16:19.330208 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:19.329292 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"
Apr 28 19:16:19.330208 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:19.329356 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"
Apr 28 19:16:19.330208 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:19.329446 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:19.330208 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.329590 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:19.330208 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.329658 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls podName:85a1e12e-6bf9-4548-83dd-a69765a8c24d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.329640313 +0000 UTC m=+40.263310092 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls") pod "dns-default-xzw6d" (UID: "85a1e12e-6bf9-4548-83dd-a69765a8c24d") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:19.330208 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.329730 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 28 19:16:19.330208 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.329768 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls podName:2e2cbc2f-671d-4390-96cf-52b82b4e889a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.329757469 +0000 UTC m=+40.263427246 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-slrjp" (UID: "2e2cbc2f-671d-4390-96cf-52b82b4e889a") : secret "samples-operator-tls" not found
Apr 28 19:16:19.330208 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.329820 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:16:19.330208 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.329850 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert podName:42b7e7a8-ff86-48ac-bd59-aab3db697272 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.329840951 +0000 UTC m=+40.263510730 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l29s7" (UID: "42b7e7a8-ff86-48ac-bd59-aab3db697272") : secret "networking-console-plugin-cert" not found
Apr 28 19:16:19.431648 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:19.430797 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:19.431648 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:19.430843 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " pod="openshift-ingress-canary/ingress-canary-t24nc"
Apr 28 19:16:19.431648 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:19.430963 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:19.431648 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.431114 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 28 19:16:19.431648 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.431189 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.431168324 +0000 UTC m=+40.364838100 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : secret "router-metrics-certs-default" not found
Apr 28 19:16:19.431648 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.431504 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:19.431648 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.431551 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert podName:cd28326f-9576-4715-a0b0-f6812113130a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.431534299 +0000 UTC m=+40.365204079 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert") pod "ingress-canary-t24nc" (UID: "cd28326f-9576-4715-a0b0-f6812113130a") : secret "canary-serving-cert" not found
Apr 28 19:16:19.431648 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:19.431618 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.431607691 +0000 UTC m=+40.365277467 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : configmap references non-existent config key: service-ca.crt
Apr 28 19:16:19.737301 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:19.737264 2565 generic.go:358] "Generic (PLEG): container finished" podID="0c174017-a3dc-4241-8008-c41fd1ae8cec" containerID="bd3ed446bea723ea88a27bd6cccb0179edfb740b1ee345d235854e903a232de4" exitCode=0
Apr 28 19:16:19.737867 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:19.737339 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xt62h" event={"ID":"0c174017-a3dc-4241-8008-c41fd1ae8cec","Type":"ContainerDied","Data":"bd3ed446bea723ea88a27bd6cccb0179edfb740b1ee345d235854e903a232de4"}
Apr 28 19:16:20.748097 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:20.748056 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xt62h" event={"ID":"0c174017-a3dc-4241-8008-c41fd1ae8cec","Type":"ContainerStarted","Data":"ced61ac1594dde7e2bc6ab7a40f7d71a627978b3df0c928530f16a935d52864c"}
Apr 28 19:16:20.775392 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:20.775334 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xt62h" podStartSLOduration=5.84575567 podStartE2EDuration="37.775314445s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.971498398 +0000 UTC m=+2.905168175" lastFinishedPulling="2026-04-28 19:16:17.901057176 +0000 UTC m=+34.834726950" observedRunningTime="2026-04-28 19:16:20.775188532 +0000 UTC m=+37.708858331" watchObservedRunningTime="2026-04-28 19:16:20.775314445 +0000 UTC m=+37.708984244"
Apr 28 19:16:23.270241 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:23.270203 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:23.270241 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:23.270244 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:23.270708 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.270370 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:23.270708 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.270374 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 28 19:16:23.270708 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.270382 2565
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6598fb7d78-gsbrc: secret "image-registry-tls" not found Apr 28 19:16:23.270708 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.270458 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls podName:a4826c50-2384-48cf-853b-ab348926b6e5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.270435577 +0000 UTC m=+48.204105368 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ml5vd" (UID: "a4826c50-2384-48cf-853b-ab348926b6e5") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:16:23.270708 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.270500 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls podName:de9861e6-21f7-4675-9370-4fa19956dd76 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.270484425 +0000 UTC m=+48.204154199 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls") pod "image-registry-6598fb7d78-gsbrc" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76") : secret "image-registry-tls" not found Apr 28 19:16:23.371139 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:23.371102 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7" Apr 28 19:16:23.371314 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:23.371205 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d" Apr 28 19:16:23.371314 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.371263 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:16:23.371314 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:23.371297 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp" Apr 28 19:16:23.371459 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.371326 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert podName:42b7e7a8-ff86-48ac-bd59-aab3db697272 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.371307924 +0000 UTC m=+48.304977705 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l29s7" (UID: "42b7e7a8-ff86-48ac-bd59-aab3db697272") : secret "networking-console-plugin-cert" not found Apr 28 19:16:23.371459 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.371366 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:16:23.371459 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.371370 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:16:23.371459 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.371401 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls podName:2e2cbc2f-671d-4390-96cf-52b82b4e889a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.371390188 +0000 UTC m=+48.305059967 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-slrjp" (UID: "2e2cbc2f-671d-4390-96cf-52b82b4e889a") : secret "samples-operator-tls" not found Apr 28 19:16:23.371459 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.371438 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls podName:85a1e12e-6bf9-4548-83dd-a69765a8c24d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.371417938 +0000 UTC m=+48.305087714 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls") pod "dns-default-xzw6d" (UID: "85a1e12e-6bf9-4548-83dd-a69765a8c24d") : secret "dns-default-metrics-tls" not found Apr 28 19:16:23.472190 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:23.472155 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:23.472396 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:23.472206 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " pod="openshift-ingress-canary/ingress-canary-t24nc" Apr 28 19:16:23.472396 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:23.472253 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:23.472396 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.472347 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.472322996 +0000 UTC m=+48.405992784 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : configmap references non-existent config key: service-ca.crt Apr 28 19:16:23.472396 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.472388 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:16:23.472576 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.472391 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:16:23.472576 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.472447 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert podName:cd28326f-9576-4715-a0b0-f6812113130a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.472430673 +0000 UTC m=+48.406100447 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert") pod "ingress-canary-t24nc" (UID: "cd28326f-9576-4715-a0b0-f6812113130a") : secret "canary-serving-cert" not found Apr 28 19:16:23.472576 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:23.472464 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.472455426 +0000 UTC m=+48.406125200 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : secret "router-metrics-certs-default" not found Apr 28 19:16:27.765588 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.765557 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/0.log" Apr 28 19:16:27.766087 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.765604 2565 generic.go:358] "Generic (PLEG): container finished" podID="f81f5464-aa16-489d-80bf-9e5bf953f7af" containerID="740042ccc744c173a287265fdd783c14f4e25cfa5624646f07a802b920f5561d" exitCode=255 Apr 28 19:16:27.766087 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.765673 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" event={"ID":"f81f5464-aa16-489d-80bf-9e5bf953f7af","Type":"ContainerDied","Data":"740042ccc744c173a287265fdd783c14f4e25cfa5624646f07a802b920f5561d"} Apr 28 19:16:27.766087 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.765935 2565 scope.go:117] "RemoveContainer" 
containerID="740042ccc744c173a287265fdd783c14f4e25cfa5624646f07a802b920f5561d" Apr 28 19:16:27.771177 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.771148 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9kcz9" event={"ID":"97ec2c5a-89fa-4b8f-9919-ad126433ee06","Type":"ContainerStarted","Data":"bd31daaf0d75f25e1accf98d67aa8f4149827750c98bb8b07c33a73dcb67bfbd"} Apr 28 19:16:27.774013 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.773971 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7r6rp" event={"ID":"29ca57b8-ae4f-4261-a245-529c6cfa8449","Type":"ContainerStarted","Data":"0fe564183d9ce310ee9883e4cfc8ed0d43fd2ee2b7aa200dc426f857606820ff"} Apr 28 19:16:27.775813 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.775784 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf" event={"ID":"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65","Type":"ContainerStarted","Data":"a53e67160bffcd379ae97463e11c55c014575aa32aade6d73f521ce8b0a23d0f"} Apr 28 19:16:27.777040 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.777017 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zj9qs" event={"ID":"d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf","Type":"ContainerStarted","Data":"c2ea6d97de6733aded03b28769f97ebc96a89419c8557e222c0a62a394552ccf"} Apr 28 19:16:27.777144 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.777130 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zj9qs" Apr 28 19:16:27.778505 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.778477 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" 
event={"ID":"d08c9e66-fe95-42f4-be54-32ce7e41a44e","Type":"ContainerStarted","Data":"8ade874ae562eac95454936bf69476cc7a468bf67723fd6c67dbce7faf37dcb3"} Apr 28 19:16:27.780313 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.780290 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-fl75w" event={"ID":"bd439354-dd75-46f4-9b42-13a91c27d851","Type":"ContainerStarted","Data":"278c8fc5ead022d7c67e3d1af4525d60bcf9eab3817485be66a8ddc284d100e7"} Apr 28 19:16:27.782107 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.782086 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pd44w" event={"ID":"113829ad-8ae9-4887-9929-882aabb0a1cb","Type":"ContainerStarted","Data":"758aa7b1e2e61676d2d0d2140301600e300af686af34e979164d373a22e0ff49"} Apr 28 19:16:27.810015 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.809959 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf" podStartSLOduration=32.946591936 podStartE2EDuration="41.809944096s" podCreationTimestamp="2026-04-28 19:15:46 +0000 UTC" firstStartedPulling="2026-04-28 19:16:17.829079103 +0000 UTC m=+34.762748877" lastFinishedPulling="2026-04-28 19:16:26.692431248 +0000 UTC m=+43.626101037" observedRunningTime="2026-04-28 19:16:27.807801796 +0000 UTC m=+44.741471590" watchObservedRunningTime="2026-04-28 19:16:27.809944096 +0000 UTC m=+44.743613933" Apr 28 19:16:27.831302 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.830281 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zj9qs" podStartSLOduration=35.676366014 podStartE2EDuration="44.830261844s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:16:17.87840635 +0000 UTC m=+34.812076127" lastFinishedPulling="2026-04-28 19:16:27.032302169 +0000 UTC 
m=+43.965971957" observedRunningTime="2026-04-28 19:16:27.82976917 +0000 UTC m=+44.763438969" watchObservedRunningTime="2026-04-28 19:16:27.830261844 +0000 UTC m=+44.763931632" Apr 28 19:16:27.884278 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.884102 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-fl75w" podStartSLOduration=32.679166203 podStartE2EDuration="41.884082545s" podCreationTimestamp="2026-04-28 19:15:46 +0000 UTC" firstStartedPulling="2026-04-28 19:16:17.826859113 +0000 UTC m=+34.760528900" lastFinishedPulling="2026-04-28 19:16:27.031775451 +0000 UTC m=+43.965445242" observedRunningTime="2026-04-28 19:16:27.858184286 +0000 UTC m=+44.791854082" watchObservedRunningTime="2026-04-28 19:16:27.884082545 +0000 UTC m=+44.817752342" Apr 28 19:16:27.909608 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.909552 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pd44w" podStartSLOduration=4.7427176840000005 podStartE2EDuration="13.909531882s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:17.878333547 +0000 UTC m=+34.812003320" lastFinishedPulling="2026-04-28 19:16:27.045147744 +0000 UTC m=+43.978817518" observedRunningTime="2026-04-28 19:16:27.885417284 +0000 UTC m=+44.819087071" watchObservedRunningTime="2026-04-28 19:16:27.909531882 +0000 UTC m=+44.843201678" Apr 28 19:16:27.909769 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.909644 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7r6rp" podStartSLOduration=33.158974864 podStartE2EDuration="41.909638409s" podCreationTimestamp="2026-04-28 19:15:46 +0000 UTC" firstStartedPulling="2026-04-28 19:16:17.878275233 +0000 UTC m=+34.811945010" lastFinishedPulling="2026-04-28 19:16:26.628938763 +0000 UTC 
m=+43.562608555" observedRunningTime="2026-04-28 19:16:27.907379278 +0000 UTC m=+44.841049078" watchObservedRunningTime="2026-04-28 19:16:27.909638409 +0000 UTC m=+44.843308208" Apr 28 19:16:27.928695 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.928634 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" podStartSLOduration=31.722188455 podStartE2EDuration="40.928617298s" podCreationTimestamp="2026-04-28 19:15:47 +0000 UTC" firstStartedPulling="2026-04-28 19:16:17.826006237 +0000 UTC m=+34.759676025" lastFinishedPulling="2026-04-28 19:16:27.032435084 +0000 UTC m=+43.966104868" observedRunningTime="2026-04-28 19:16:27.927060237 +0000 UTC m=+44.860730034" watchObservedRunningTime="2026-04-28 19:16:27.928617298 +0000 UTC m=+44.862287092" Apr 28 19:16:27.961163 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:27.961119 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9kcz9" podStartSLOduration=32.755923756 podStartE2EDuration="41.961104121s" podCreationTimestamp="2026-04-28 19:15:46 +0000 UTC" firstStartedPulling="2026-04-28 19:16:17.827585716 +0000 UTC m=+34.761255505" lastFinishedPulling="2026-04-28 19:16:27.032766093 +0000 UTC m=+43.966435870" observedRunningTime="2026-04-28 19:16:27.960068783 +0000 UTC m=+44.893738581" watchObservedRunningTime="2026-04-28 19:16:27.961104121 +0000 UTC m=+44.894773916" Apr 28 19:16:28.701241 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.701207 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d"] Apr 28 19:16:28.730251 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.730221 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d"] Apr 28 19:16:28.730416 
ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.730370 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d" Apr 28 19:16:28.733409 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.733382 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:28.733533 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.733383 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 28 19:16:28.734524 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.734505 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-vk5kj\"" Apr 28 19:16:28.786518 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.786490 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log" Apr 28 19:16:28.786966 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.786874 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/0.log" Apr 28 19:16:28.786966 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.786905 2565 generic.go:358] "Generic (PLEG): container finished" podID="f81f5464-aa16-489d-80bf-9e5bf953f7af" containerID="f2037accf07ad777a8e4761b2f93dea7c27744ed1d6be0720529d85e52d9b34d" exitCode=255 Apr 28 19:16:28.786966 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.786953 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" 
event={"ID":"f81f5464-aa16-489d-80bf-9e5bf953f7af","Type":"ContainerDied","Data":"f2037accf07ad777a8e4761b2f93dea7c27744ed1d6be0720529d85e52d9b34d"} Apr 28 19:16:28.787148 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.787013 2565 scope.go:117] "RemoveContainer" containerID="740042ccc744c173a287265fdd783c14f4e25cfa5624646f07a802b920f5561d" Apr 28 19:16:28.787335 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.787304 2565 scope.go:117] "RemoveContainer" containerID="f2037accf07ad777a8e4761b2f93dea7c27744ed1d6be0720529d85e52d9b34d" Apr 28 19:16:28.787626 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:28.787506 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-p6wm2_openshift-console-operator(f81f5464-aa16-489d-80bf-9e5bf953f7af)\"" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" podUID="f81f5464-aa16-489d-80bf-9e5bf953f7af" Apr 28 19:16:28.823281 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.823242 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbcc\" (UniqueName: \"kubernetes.io/projected/88aaf3ad-9181-4a09-9d18-1650fe56cac6-kube-api-access-xfbcc\") pod \"migrator-74bb7799d9-mk44d\" (UID: \"88aaf3ad-9181-4a09-9d18-1650fe56cac6\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d" Apr 28 19:16:28.924648 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:28.924609 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbcc\" (UniqueName: \"kubernetes.io/projected/88aaf3ad-9181-4a09-9d18-1650fe56cac6-kube-api-access-xfbcc\") pod \"migrator-74bb7799d9-mk44d\" (UID: \"88aaf3ad-9181-4a09-9d18-1650fe56cac6\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d" Apr 28 19:16:28.943666 ip-10-0-133-121 
kubenswrapper[2565]: I0428 19:16:28.943631 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbcc\" (UniqueName: \"kubernetes.io/projected/88aaf3ad-9181-4a09-9d18-1650fe56cac6-kube-api-access-xfbcc\") pod \"migrator-74bb7799d9-mk44d\" (UID: \"88aaf3ad-9181-4a09-9d18-1650fe56cac6\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d" Apr 28 19:16:29.039840 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:29.039803 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d" Apr 28 19:16:29.178745 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:29.178681 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d"] Apr 28 19:16:29.184544 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:29.184494 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88aaf3ad_9181_4a09_9d18_1650fe56cac6.slice/crio-30f18f6b487d5ac2053c76ac4ba20ee9bacf2bb5a0e4adb93140c965bb9ca0b0 WatchSource:0}: Error finding container 30f18f6b487d5ac2053c76ac4ba20ee9bacf2bb5a0e4adb93140c965bb9ca0b0: Status 404 returned error can't find the container with id 30f18f6b487d5ac2053c76ac4ba20ee9bacf2bb5a0e4adb93140c965bb9ca0b0 Apr 28 19:16:29.792609 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:29.792576 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log" Apr 28 19:16:29.793066 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:29.793018 2565 scope.go:117] "RemoveContainer" containerID="f2037accf07ad777a8e4761b2f93dea7c27744ed1d6be0720529d85e52d9b34d" Apr 28 19:16:29.793274 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:29.793245 2565 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-p6wm2_openshift-console-operator(f81f5464-aa16-489d-80bf-9e5bf953f7af)\"" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" podUID="f81f5464-aa16-489d-80bf-9e5bf953f7af" Apr 28 19:16:29.793854 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:29.793820 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d" event={"ID":"88aaf3ad-9181-4a09-9d18-1650fe56cac6","Type":"ContainerStarted","Data":"30f18f6b487d5ac2053c76ac4ba20ee9bacf2bb5a0e4adb93140c965bb9ca0b0"} Apr 28 19:16:30.802127 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:30.802094 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d" event={"ID":"88aaf3ad-9181-4a09-9d18-1650fe56cac6","Type":"ContainerStarted","Data":"f1ec6874a4214bacf414ddf7d33dfe0bda3c35e531924cc06c18256880ebdaa6"} Apr 28 19:16:30.937173 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:30.937141 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xvbxs"] Apr 28 19:16:30.950202 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:30.950171 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xvbxs"
Apr 28 19:16:30.953490 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:30.953464 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xvbxs"]
Apr 28 19:16:30.954890 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:30.954867 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 28 19:16:30.955023 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:30.954878 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 28 19:16:30.956301 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:30.956282 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 28 19:16:30.956407 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:30.956337 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-tmpkv\""
Apr 28 19:16:30.956471 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:30.956388 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 28 19:16:31.042816 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.042783 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctwgk\" (UniqueName: \"kubernetes.io/projected/e1db2ddd-22bb-452a-893b-2267caf3faa2-kube-api-access-ctwgk\") pod \"service-ca-865cb79987-xvbxs\" (UID: \"e1db2ddd-22bb-452a-893b-2267caf3faa2\") " pod="openshift-service-ca/service-ca-865cb79987-xvbxs"
Apr 28 19:16:31.043002 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.042851 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e1db2ddd-22bb-452a-893b-2267caf3faa2-signing-key\") pod \"service-ca-865cb79987-xvbxs\" (UID: \"e1db2ddd-22bb-452a-893b-2267caf3faa2\") " pod="openshift-service-ca/service-ca-865cb79987-xvbxs"
Apr 28 19:16:31.043002 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.042911 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e1db2ddd-22bb-452a-893b-2267caf3faa2-signing-cabundle\") pod \"service-ca-865cb79987-xvbxs\" (UID: \"e1db2ddd-22bb-452a-893b-2267caf3faa2\") " pod="openshift-service-ca/service-ca-865cb79987-xvbxs"
Apr 28 19:16:31.143548 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.143514 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctwgk\" (UniqueName: \"kubernetes.io/projected/e1db2ddd-22bb-452a-893b-2267caf3faa2-kube-api-access-ctwgk\") pod \"service-ca-865cb79987-xvbxs\" (UID: \"e1db2ddd-22bb-452a-893b-2267caf3faa2\") " pod="openshift-service-ca/service-ca-865cb79987-xvbxs"
Apr 28 19:16:31.143716 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.143572 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e1db2ddd-22bb-452a-893b-2267caf3faa2-signing-key\") pod \"service-ca-865cb79987-xvbxs\" (UID: \"e1db2ddd-22bb-452a-893b-2267caf3faa2\") " pod="openshift-service-ca/service-ca-865cb79987-xvbxs"
Apr 28 19:16:31.143716 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.143598 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e1db2ddd-22bb-452a-893b-2267caf3faa2-signing-cabundle\") pod \"service-ca-865cb79987-xvbxs\" (UID: \"e1db2ddd-22bb-452a-893b-2267caf3faa2\") " pod="openshift-service-ca/service-ca-865cb79987-xvbxs"
Apr 28 19:16:31.147455 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.147435 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e1db2ddd-22bb-452a-893b-2267caf3faa2-signing-cabundle\") pod \"service-ca-865cb79987-xvbxs\" (UID: \"e1db2ddd-22bb-452a-893b-2267caf3faa2\") " pod="openshift-service-ca/service-ca-865cb79987-xvbxs"
Apr 28 19:16:31.147519 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.147490 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e1db2ddd-22bb-452a-893b-2267caf3faa2-signing-key\") pod \"service-ca-865cb79987-xvbxs\" (UID: \"e1db2ddd-22bb-452a-893b-2267caf3faa2\") " pod="openshift-service-ca/service-ca-865cb79987-xvbxs"
Apr 28 19:16:31.152966 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.152938 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctwgk\" (UniqueName: \"kubernetes.io/projected/e1db2ddd-22bb-452a-893b-2267caf3faa2-kube-api-access-ctwgk\") pod \"service-ca-865cb79987-xvbxs\" (UID: \"e1db2ddd-22bb-452a-893b-2267caf3faa2\") " pod="openshift-service-ca/service-ca-865cb79987-xvbxs"
Apr 28 19:16:31.183149 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.183125 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9vshb_a521c111-aa4b-4eda-b86b-7b9f76fcd75f/dns-node-resolver/0.log"
Apr 28 19:16:31.260047 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.260006 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xvbxs"
Apr 28 19:16:31.345588 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.345552 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:31.345588 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.345590 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:31.345815 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.345707 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 28 19:16:31.345815 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.345777 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:31.345815 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.345787 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls podName:a4826c50-2384-48cf-853b-ab348926b6e5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.34576498 +0000 UTC m=+64.279434756 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ml5vd" (UID: "a4826c50-2384-48cf-853b-ab348926b6e5") : secret "cluster-monitoring-operator-tls" not found
Apr 28 19:16:31.345815 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.345791 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6598fb7d78-gsbrc: secret "image-registry-tls" not found
Apr 28 19:16:31.346021 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.345838 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls podName:de9861e6-21f7-4675-9370-4fa19956dd76 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.345822033 +0000 UTC m=+64.279491809 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls") pod "image-registry-6598fb7d78-gsbrc" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76") : secret "image-registry-tls" not found
Apr 28 19:16:31.377271 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.377240 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xvbxs"]
Apr 28 19:16:31.379996 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:31.379959 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1db2ddd_22bb_452a_893b_2267caf3faa2.slice/crio-e7515f0561c87de20c4d43975320878308614714b909e3ebd22e4b733a00faa3 WatchSource:0}: Error finding container e7515f0561c87de20c4d43975320878308614714b909e3ebd22e4b733a00faa3: Status 404 returned error can't find the container with id e7515f0561c87de20c4d43975320878308614714b909e3ebd22e4b733a00faa3
Apr 28 19:16:31.446851 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.446827 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"
Apr 28 19:16:31.446967 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.446913 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:31.447062 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.446990 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"
Apr 28 19:16:31.447062 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.446996 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:16:31.447163 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.447065 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert podName:42b7e7a8-ff86-48ac-bd59-aab3db697272 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.447045279 +0000 UTC m=+64.380715071 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l29s7" (UID: "42b7e7a8-ff86-48ac-bd59-aab3db697272") : secret "networking-console-plugin-cert" not found
Apr 28 19:16:31.447163 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.447065 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:31.447163 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.447075 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 28 19:16:31.447163 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.447102 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls podName:85a1e12e-6bf9-4548-83dd-a69765a8c24d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.447092399 +0000 UTC m=+64.380762175 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls") pod "dns-default-xzw6d" (UID: "85a1e12e-6bf9-4548-83dd-a69765a8c24d") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:31.447163 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.447120 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls podName:2e2cbc2f-671d-4390-96cf-52b82b4e889a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.447109388 +0000 UTC m=+64.380779168 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-slrjp" (UID: "2e2cbc2f-671d-4390-96cf-52b82b4e889a") : secret "samples-operator-tls" not found
Apr 28 19:16:31.547715 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.547685 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " pod="openshift-ingress-canary/ingress-canary-t24nc"
Apr 28 19:16:31.547874 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.547739 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:31.547874 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.547851 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:31.547943 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.547875 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 28 19:16:31.547943 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.547883 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:31.547943 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.547921 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert podName:cd28326f-9576-4715-a0b0-f6812113130a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.547898802 +0000 UTC m=+64.481568582 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert") pod "ingress-canary-t24nc" (UID: "cd28326f-9576-4715-a0b0-f6812113130a") : secret "canary-serving-cert" not found
Apr 28 19:16:31.547943 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.547940 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.54793056 +0000 UTC m=+64.481600337 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : secret "router-metrics-certs-default" not found
Apr 28 19:16:31.548108 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:31.548001 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle podName:ae143e08-b979-475a-abb9-654fdf653811 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.54796495 +0000 UTC m=+64.481634724 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle") pod "router-default-d5c596596-vxxm2" (UID: "ae143e08-b979-475a-abb9-654fdf653811") : configmap references non-existent config key: service-ca.crt
Apr 28 19:16:31.806108 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.806072 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d" event={"ID":"88aaf3ad-9181-4a09-9d18-1650fe56cac6","Type":"ContainerStarted","Data":"fe514e0312cdb56dffdecfd628633ccaa3022be7a0625cecdac8f7d87748661d"}
Apr 28 19:16:31.807439 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.807415 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xvbxs" event={"ID":"e1db2ddd-22bb-452a-893b-2267caf3faa2","Type":"ContainerStarted","Data":"f92c0eb529862da26b94a6bdb7486593701b82f4581ee4a5d9229c846b60f778"}
Apr 28 19:16:31.807520 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.807448 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xvbxs" event={"ID":"e1db2ddd-22bb-452a-893b-2267caf3faa2","Type":"ContainerStarted","Data":"e7515f0561c87de20c4d43975320878308614714b909e3ebd22e4b733a00faa3"}
Apr 28 19:16:31.827744 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.827695 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mk44d" podStartSLOduration=2.290846613 podStartE2EDuration="3.827679974s" podCreationTimestamp="2026-04-28 19:16:28 +0000 UTC" firstStartedPulling="2026-04-28 19:16:29.187077116 +0000 UTC m=+46.120746897" lastFinishedPulling="2026-04-28 19:16:30.723910481 +0000 UTC m=+47.657580258" observedRunningTime="2026-04-28 19:16:31.826872606 +0000 UTC m=+48.760542398" watchObservedRunningTime="2026-04-28 19:16:31.827679974 +0000 UTC m=+48.761349771"
Apr 28 19:16:31.848599 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:31.848560 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-xvbxs" podStartSLOduration=1.848544806 podStartE2EDuration="1.848544806s" podCreationTimestamp="2026-04-28 19:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:31.846882041 +0000 UTC m=+48.780551839" watchObservedRunningTime="2026-04-28 19:16:31.848544806 +0000 UTC m=+48.782214603"
Apr 28 19:16:32.175388 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:32.175316 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jn4w7_0b82110c-7495-4dba-b0fc-b29ca1b890f4/node-ca/0.log"
Apr 28 19:16:32.988133 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:32.987947 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mk44d_88aaf3ad-9181-4a09-9d18-1650fe56cac6/migrator/0.log"
Apr 28 19:16:33.175904 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:33.175841 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mk44d_88aaf3ad-9181-4a09-9d18-1650fe56cac6/graceful-termination/0.log"
Apr 28 19:16:33.407486 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:33.407457 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-r5k9t_d08c9e66-fe95-42f4-be54-32ce7e41a44e/kube-storage-version-migrator-operator/0.log"
Apr 28 19:16:35.845045 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:35.845000 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:35.845045 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:35.845050 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:35.845650 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:35.845408 2565 scope.go:117] "RemoveContainer" containerID="f2037accf07ad777a8e4761b2f93dea7c27744ed1d6be0720529d85e52d9b34d"
Apr 28 19:16:35.845650 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:35.845561 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-p6wm2_openshift-console-operator(f81f5464-aa16-489d-80bf-9e5bf953f7af)\"" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" podUID="f81f5464-aa16-489d-80bf-9e5bf953f7af"
Apr 28 19:16:40.663490 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:40.663456 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hkn59"
Apr 28 19:16:46.498763 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:46.498730 2565 scope.go:117] "RemoveContainer" containerID="f2037accf07ad777a8e4761b2f93dea7c27744ed1d6be0720529d85e52d9b34d"
Apr 28 19:16:46.844865 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:46.844833 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:16:46.845038 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:46.844903 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" event={"ID":"f81f5464-aa16-489d-80bf-9e5bf953f7af","Type":"ContainerStarted","Data":"3b4e0993d1d46e36319c499a71d98dcc1daeedc6d5ac4df2c42eecdf367879a9"}
Apr 28 19:16:46.845288 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:46.845267 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:46.865560 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:46.865515 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2" podStartSLOduration=51.661731638 podStartE2EDuration="1m0.865480114s" podCreationTimestamp="2026-04-28 19:15:46 +0000 UTC" firstStartedPulling="2026-04-28 19:16:17.828363014 +0000 UTC m=+34.762032794" lastFinishedPulling="2026-04-28 19:16:27.032111482 +0000 UTC m=+43.965781270" observedRunningTime="2026-04-28 19:16:46.86536218 +0000 UTC m=+63.799031975" watchObservedRunningTime="2026-04-28 19:16:46.865480114 +0000 UTC m=+63.799149892"
Apr 28 19:16:47.107355 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.107283 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-p6wm2"
Apr 28 19:16:47.389641 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.389557 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:47.389641 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.389596 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:47.391952 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.391924 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls\") pod \"image-registry-6598fb7d78-gsbrc\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:47.392077 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.392031 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4826c50-2384-48cf-853b-ab348926b6e5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ml5vd\" (UID: \"a4826c50-2384-48cf-853b-ab348926b6e5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:47.490761 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.490721 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:47.490925 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.490819 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"
Apr 28 19:16:47.490925 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.490849 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"
Apr 28 19:16:47.493027 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.492997 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85a1e12e-6bf9-4548-83dd-a69765a8c24d-metrics-tls\") pod \"dns-default-xzw6d\" (UID: \"85a1e12e-6bf9-4548-83dd-a69765a8c24d\") " pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:47.493231 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.493212 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b7e7a8-ff86-48ac-bd59-aab3db697272-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l29s7\" (UID: \"42b7e7a8-ff86-48ac-bd59-aab3db697272\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"
Apr 28 19:16:47.493299 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.493251 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e2cbc2f-671d-4390-96cf-52b82b4e889a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-slrjp\" (UID: \"2e2cbc2f-671d-4390-96cf-52b82b4e889a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"
Apr 28 19:16:47.562272 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.562245 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b9m64\""
Apr 28 19:16:47.570164 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.570147 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:47.575397 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.575375 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-zgpqd\""
Apr 28 19:16:47.583889 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.583865 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"
Apr 28 19:16:47.592005 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.591957 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:47.592118 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.592021 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " pod="openshift-ingress-canary/ingress-canary-t24nc"
Apr 28 19:16:47.592118 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.592073 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:47.592532 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.592505 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae143e08-b979-475a-abb9-654fdf653811-service-ca-bundle\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:47.594841 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.594795 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd28326f-9576-4715-a0b0-f6812113130a-cert\") pod \"ingress-canary-t24nc\" (UID: \"cd28326f-9576-4715-a0b0-f6812113130a\") " pod="openshift-ingress-canary/ingress-canary-t24nc"
Apr 28 19:16:47.595237 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.595213 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae143e08-b979-475a-abb9-654fdf653811-metrics-certs\") pod \"router-default-d5c596596-vxxm2\" (UID: \"ae143e08-b979-475a-abb9-654fdf653811\") " pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:47.663563 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.663534 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-g5crm\""
Apr 28 19:16:47.670712 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.670690 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"
Apr 28 19:16:47.701702 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.701676 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xdh6h\""
Apr 28 19:16:47.709726 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.709693 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xzw6d"
Apr 28 19:16:47.715625 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.715584 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6598fb7d78-gsbrc"]
Apr 28 19:16:47.717441 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.717337 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-t8rzl\""
Apr 28 19:16:47.721376 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:47.721346 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde9861e6_21f7_4675_9370_4fa19956dd76.slice/crio-57a7b399bacbeaa97eef5a704b3a73a50b0e8200dc70f43e22b809c57679f969 WatchSource:0}: Error finding container 57a7b399bacbeaa97eef5a704b3a73a50b0e8200dc70f43e22b809c57679f969: Status 404 returned error can't find the container with id 57a7b399bacbeaa97eef5a704b3a73a50b0e8200dc70f43e22b809c57679f969
Apr 28 19:16:47.722947 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.722926 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"
Apr 28 19:16:47.742909 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.742866 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd"]
Apr 28 19:16:47.746170 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:47.746139 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4826c50_2384_48cf_853b_ab348926b6e5.slice/crio-b4c76453f64e70d292f850f615dcd3a3bc24913a38d20a424558680273de4a8f WatchSource:0}: Error finding container b4c76453f64e70d292f850f615dcd3a3bc24913a38d20a424558680273de4a8f: Status 404 returned error can't find the container with id b4c76453f64e70d292f850f615dcd3a3bc24913a38d20a424558680273de4a8f
Apr 28 19:16:47.751274 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.751253 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-d8gn2\""
Apr 28 19:16:47.758838 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.758272 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-d5c596596-vxxm2"
Apr 28 19:16:47.765043 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.764934 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qbgzj\""
Apr 28 19:16:47.772678 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.772480 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t24nc"
Apr 28 19:16:47.830255 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.830171 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp"]
Apr 28 19:16:47.849519 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.849451 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" event={"ID":"de9861e6-21f7-4675-9370-4fa19956dd76","Type":"ContainerStarted","Data":"57a7b399bacbeaa97eef5a704b3a73a50b0e8200dc70f43e22b809c57679f969"}
Apr 28 19:16:47.851520 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.851446 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd" event={"ID":"a4826c50-2384-48cf-853b-ab348926b6e5","Type":"ContainerStarted","Data":"b4c76453f64e70d292f850f615dcd3a3bc24913a38d20a424558680273de4a8f"}
Apr 28 19:16:47.882012 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.881853 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xzw6d"]
Apr 28 19:16:47.902541 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.902452 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-l29s7"]
Apr 28 19:16:47.920968 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:47.913188 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85a1e12e_6bf9_4548_83dd_a69765a8c24d.slice/crio-33f9c95478332f19dbc2c6293ef06bbb59a8a267894db81b1b0bbb2aeea20397 WatchSource:0}: Error finding container 33f9c95478332f19dbc2c6293ef06bbb59a8a267894db81b1b0bbb2aeea20397: Status 404 returned error can't find the container with id 33f9c95478332f19dbc2c6293ef06bbb59a8a267894db81b1b0bbb2aeea20397
Apr 28 19:16:47.969965 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.969823 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d5c596596-vxxm2"]
Apr 28 19:16:47.972413 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:47.972381 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae143e08_b979_475a_abb9_654fdf653811.slice/crio-9c395fc557adcaaaa579a2fc1064b998021905dd28069f9deeaf9a49919b08c3 WatchSource:0}: Error finding container 9c395fc557adcaaaa579a2fc1064b998021905dd28069f9deeaf9a49919b08c3: Status 404 returned error can't find the container with id 9c395fc557adcaaaa579a2fc1064b998021905dd28069f9deeaf9a49919b08c3
Apr 28 19:16:47.972738 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:47.972711 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t24nc"]
Apr 28 19:16:47.976481 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:47.976460 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd28326f_9576_4715_a0b0_f6812113130a.slice/crio-0e4adcb01bb86b637b9434b9ceb398581379cea9abf1da4a34021416538796ab WatchSource:0}: Error finding container 0e4adcb01bb86b637b9434b9ceb398581379cea9abf1da4a34021416538796ab: Status 404 returned error can't find the container with id 0e4adcb01bb86b637b9434b9ceb398581379cea9abf1da4a34021416538796ab
Apr 28 19:16:48.858717 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:48.858608 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" event={"ID":"de9861e6-21f7-4675-9370-4fa19956dd76","Type":"ContainerStarted","Data":"ce2742b470d9896fe6f12ab6711e51a7ae7a6cd8a25f1493798f48772ec24ee1"}
Apr 28 19:16:48.858717 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:48.858667 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc"
Apr 28 19:16:48.863950 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:48.863925 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t24nc" event={"ID":"cd28326f-9576-4715-a0b0-f6812113130a","Type":"ContainerStarted","Data":"0e4adcb01bb86b637b9434b9ceb398581379cea9abf1da4a34021416538796ab"}
Apr 28 19:16:48.867476 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:48.867450 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xzw6d" event={"ID":"85a1e12e-6bf9-4548-83dd-a69765a8c24d","Type":"ContainerStarted","Data":"33f9c95478332f19dbc2c6293ef06bbb59a8a267894db81b1b0bbb2aeea20397"}
Apr 28 19:16:48.869945 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:48.869894 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7" event={"ID":"42b7e7a8-ff86-48ac-bd59-aab3db697272","Type":"ContainerStarted","Data":"07dddf5fe8a42e0262aefc1c57e8c1d23331cd2af020f7fbc923aaba7a311529"}
Apr 28 19:16:48.872261 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:48.871714 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d5c596596-vxxm2" event={"ID":"ae143e08-b979-475a-abb9-654fdf653811","Type":"ContainerStarted","Data":"f99e198da546506084d5b617dd1da83feceef23a04ce6250c12c75e7e782aedf"}
Apr 28 19:16:48.872261 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:48.871739 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d5c596596-vxxm2" event={"ID":"ae143e08-b979-475a-abb9-654fdf653811","Type":"ContainerStarted","Data":"9c395fc557adcaaaa579a2fc1064b998021905dd28069f9deeaf9a49919b08c3"}
Apr 28 19:16:48.875806 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:48.875743 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp" event={"ID":"2e2cbc2f-671d-4390-96cf-52b82b4e889a","Type":"ContainerStarted","Data":"ae25a1412eccd4540adfbc133f4f4b4f5106958447176b6e72abab1dbb1e4f1b"} Apr 28 19:16:48.883788 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:48.883741 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" podStartSLOduration=64.883726665 podStartE2EDuration="1m4.883726665s" podCreationTimestamp="2026-04-28 19:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:48.880592646 +0000 UTC m=+65.814262444" watchObservedRunningTime="2026-04-28 19:16:48.883726665 +0000 UTC m=+65.817396462" Apr 28 19:16:49.210538 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:49.210442 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:16:49.214667 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:49.214641 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a559869f-dc8c-4397-aa54-b59c274faa74-metrics-certs\") pod \"network-metrics-daemon-nfhpq\" (UID: \"a559869f-dc8c-4397-aa54-b59c274faa74\") " pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:16:49.440763 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:49.440543 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6mdm8\"" Apr 28 19:16:49.449025 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:49.448853 2565 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfhpq" Apr 28 19:16:49.611815 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:49.611760 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-d5c596596-vxxm2" podStartSLOduration=62.611739041 podStartE2EDuration="1m2.611739041s" podCreationTimestamp="2026-04-28 19:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:48.90391286 +0000 UTC m=+65.837582657" watchObservedRunningTime="2026-04-28 19:16:49.611739041 +0000 UTC m=+66.545408840" Apr 28 19:16:49.612027 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:49.612011 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nfhpq"] Apr 28 19:16:49.759763 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:49.759722 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:49.762332 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:49.762092 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:49.878828 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:49.878793 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:49.880172 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:49.880149 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-d5c596596-vxxm2" Apr 28 19:16:50.266701 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:50.266673 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda559869f_dc8c_4397_aa54_b59c274faa74.slice/crio-005e426e6e979c9ac1518be2c511cd71348674f1a2eea43decb27c3f49edff41 WatchSource:0}: Error finding container 005e426e6e979c9ac1518be2c511cd71348674f1a2eea43decb27c3f49edff41: Status 404 returned error can't find the container with id 005e426e6e979c9ac1518be2c511cd71348674f1a2eea43decb27c3f49edff41 Apr 28 19:16:50.882874 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:50.882837 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nfhpq" event={"ID":"a559869f-dc8c-4397-aa54-b59c274faa74","Type":"ContainerStarted","Data":"005e426e6e979c9ac1518be2c511cd71348674f1a2eea43decb27c3f49edff41"} Apr 28 19:16:51.898052 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:51.897829 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t24nc" event={"ID":"cd28326f-9576-4715-a0b0-f6812113130a","Type":"ContainerStarted","Data":"8e0714a4646be0bc4ef9665a02b4de26afa9ee968cc32cb25f71aa91993a7c16"} Apr 28 19:16:51.902722 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:51.902656 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xzw6d" event={"ID":"85a1e12e-6bf9-4548-83dd-a69765a8c24d","Type":"ContainerStarted","Data":"6c0396d543c2554eab1d6f4417decbffb90d49e1293bf0c4b17868c3abd185d9"} Apr 28 19:16:51.908921 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:51.908416 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7" event={"ID":"42b7e7a8-ff86-48ac-bd59-aab3db697272","Type":"ContainerStarted","Data":"c68985e91996d7ce155f5ca761100ee4f52b5dadbe566007d135475a0b30755d"} Apr 28 19:16:51.912043 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:51.911513 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp" event={"ID":"2e2cbc2f-671d-4390-96cf-52b82b4e889a","Type":"ContainerStarted","Data":"1d0d1314c540ee8e0446289a411a7fcd9d1464b627d9ed981d633f425dc4b424"} Apr 28 19:16:51.912043 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:51.911545 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp" event={"ID":"2e2cbc2f-671d-4390-96cf-52b82b4e889a","Type":"ContainerStarted","Data":"da4a0207c5d5c3f0fa028fdff9be1b747f7426224a3eb8e7e92eac5d1620f2b0"} Apr 28 19:16:51.916359 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:51.916270 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd" event={"ID":"a4826c50-2384-48cf-853b-ab348926b6e5","Type":"ContainerStarted","Data":"6737ff6179ff31d70c2912787ef09d6c7c8090717c420c248f18fa685d832ae9"} Apr 28 19:16:51.943552 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:51.943342 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-l29s7" podStartSLOduration=61.227830787 podStartE2EDuration="1m4.943324194s" podCreationTimestamp="2026-04-28 19:15:47 +0000 UTC" firstStartedPulling="2026-04-28 19:16:47.930690062 +0000 UTC m=+64.864359844" lastFinishedPulling="2026-04-28 19:16:51.646183458 +0000 UTC m=+68.579853251" observedRunningTime="2026-04-28 19:16:51.94200285 +0000 UTC m=+68.875672644" watchObservedRunningTime="2026-04-28 19:16:51.943324194 +0000 UTC m=+68.876993992" Apr 28 19:16:51.943768 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:51.943725 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t24nc" podStartSLOduration=33.270113915 podStartE2EDuration="36.943716366s" podCreationTimestamp="2026-04-28 19:16:15 +0000 UTC" firstStartedPulling="2026-04-28 
19:16:47.978078659 +0000 UTC m=+64.911748434" lastFinishedPulling="2026-04-28 19:16:51.651681108 +0000 UTC m=+68.585350885" observedRunningTime="2026-04-28 19:16:51.921841395 +0000 UTC m=+68.855511192" watchObservedRunningTime="2026-04-28 19:16:51.943716366 +0000 UTC m=+68.877386163" Apr 28 19:16:51.962813 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:51.962695 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ml5vd" podStartSLOduration=62.063364906 podStartE2EDuration="1m5.962675767s" podCreationTimestamp="2026-04-28 19:15:46 +0000 UTC" firstStartedPulling="2026-04-28 19:16:47.748055099 +0000 UTC m=+64.681724873" lastFinishedPulling="2026-04-28 19:16:51.647365956 +0000 UTC m=+68.581035734" observedRunningTime="2026-04-28 19:16:51.961760714 +0000 UTC m=+68.895430511" watchObservedRunningTime="2026-04-28 19:16:51.962675767 +0000 UTC m=+68.896345564" Apr 28 19:16:51.980504 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:51.979425 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-slrjp" podStartSLOduration=62.23610462 podStartE2EDuration="1m5.979407624s" podCreationTimestamp="2026-04-28 19:15:46 +0000 UTC" firstStartedPulling="2026-04-28 19:16:47.909271685 +0000 UTC m=+64.842941474" lastFinishedPulling="2026-04-28 19:16:51.652574683 +0000 UTC m=+68.586244478" observedRunningTime="2026-04-28 19:16:51.979009564 +0000 UTC m=+68.912679354" watchObservedRunningTime="2026-04-28 19:16:51.979407624 +0000 UTC m=+68.913077420" Apr 28 19:16:52.885605 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.885576 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-ntdxj"] Apr 28 19:16:52.888819 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.888797 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ntdxj" Apr 28 19:16:52.893315 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.893287 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-8mpn6\"" Apr 28 19:16:52.893605 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.893589 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 28 19:16:52.893948 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.893933 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 28 19:16:52.910868 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.910835 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ntdxj"] Apr 28 19:16:52.921830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.921755 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nfhpq" event={"ID":"a559869f-dc8c-4397-aa54-b59c274faa74","Type":"ContainerStarted","Data":"09a9e3086fcc0a5845183824b05fa9aeb51e9830f5ec8fe3c08935027dd038e1"} Apr 28 19:16:52.921830 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.921800 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nfhpq" event={"ID":"a559869f-dc8c-4397-aa54-b59c274faa74","Type":"ContainerStarted","Data":"e8a8a6d1a0a4acffd408fa9e4b10ac0226416d5ec1b3c5aaa1fdee69576bb398"} Apr 28 19:16:52.923523 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.923492 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xzw6d" event={"ID":"85a1e12e-6bf9-4548-83dd-a69765a8c24d","Type":"ContainerStarted","Data":"a80da9bf2ed844b5b78e043122424f676efd8ba207d8a42341e1af44375c115d"} Apr 28 19:16:52.924036 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.924017 2565 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xzw6d" Apr 28 19:16:52.979190 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.979153 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hvxg8"] Apr 28 19:16:52.979688 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.979645 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nfhpq" podStartSLOduration=67.810406201 podStartE2EDuration="1m9.979629545s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:16:50.2944727 +0000 UTC m=+67.228142480" lastFinishedPulling="2026-04-28 19:16:52.46369605 +0000 UTC m=+69.397365824" observedRunningTime="2026-04-28 19:16:52.974852091 +0000 UTC m=+69.908521886" watchObservedRunningTime="2026-04-28 19:16:52.979629545 +0000 UTC m=+69.913299341" Apr 28 19:16:52.982487 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.982469 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:52.986472 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.986451 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-tpf57\"" Apr 28 19:16:52.989493 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.989433 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 28 19:16:52.989493 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:52.989452 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 28 19:16:53.003834 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.003808 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hvxg8"] Apr 28 19:16:53.016114 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.016083 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj"] Apr 28 19:16:53.019083 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.019069 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj" Apr 28 19:16:53.043816 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.043772 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xzw6d" podStartSLOduration=34.317272739 podStartE2EDuration="38.043758716s" podCreationTimestamp="2026-04-28 19:16:15 +0000 UTC" firstStartedPulling="2026-04-28 19:16:47.921440081 +0000 UTC m=+64.855109862" lastFinishedPulling="2026-04-28 19:16:51.647926065 +0000 UTC m=+68.581595839" observedRunningTime="2026-04-28 19:16:53.039577453 +0000 UTC m=+69.973247249" watchObservedRunningTime="2026-04-28 19:16:53.043758716 +0000 UTC m=+69.977428511" Apr 28 19:16:53.044255 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.044236 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-rjzgl\"" Apr 28 19:16:53.044368 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.044351 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 28 19:16:53.045147 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.045131 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhnpr\" (UniqueName: \"kubernetes.io/projected/828ffceb-04a7-4751-b3fd-abe2a2db01c5-kube-api-access-rhnpr\") pod \"downloads-6bcc868b7-ntdxj\" (UID: \"828ffceb-04a7-4751-b3fd-abe2a2db01c5\") " pod="openshift-console/downloads-6bcc868b7-ntdxj" Apr 28 19:16:53.045396 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.045379 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/94af290b-e61d-41bf-b802-a973593865a7-data-volume\") pod \"insights-runtime-extractor-hvxg8\" (UID: 
\"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.045483 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.045457 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/94af290b-e61d-41bf-b802-a973593865a7-crio-socket\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.045545 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.045516 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb7sf\" (UniqueName: \"kubernetes.io/projected/94af290b-e61d-41bf-b802-a973593865a7-kube-api-access-zb7sf\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.045648 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.045630 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/94af290b-e61d-41bf-b802-a973593865a7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.045701 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.045686 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/94af290b-e61d-41bf-b802-a973593865a7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.084519 ip-10-0-133-121 kubenswrapper[2565]: I0428 
19:16:53.084487 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj"] Apr 28 19:16:53.147310 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.147220 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zb7sf\" (UniqueName: \"kubernetes.io/projected/94af290b-e61d-41bf-b802-a973593865a7-kube-api-access-zb7sf\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.147310 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.147280 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/94af290b-e61d-41bf-b802-a973593865a7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.147310 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.147308 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/94af290b-e61d-41bf-b802-a973593865a7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.147597 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.147346 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3f4c4045-dbe3-4ff4-866e-b265cc183438-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9b4wj\" (UID: \"3f4c4045-dbe3-4ff4-866e-b265cc183438\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj" Apr 28 19:16:53.147597 
ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.147380 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhnpr\" (UniqueName: \"kubernetes.io/projected/828ffceb-04a7-4751-b3fd-abe2a2db01c5-kube-api-access-rhnpr\") pod \"downloads-6bcc868b7-ntdxj\" (UID: \"828ffceb-04a7-4751-b3fd-abe2a2db01c5\") " pod="openshift-console/downloads-6bcc868b7-ntdxj" Apr 28 19:16:53.147597 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.147462 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/94af290b-e61d-41bf-b802-a973593865a7-data-volume\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.147597 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.147512 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/94af290b-e61d-41bf-b802-a973593865a7-crio-socket\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.147744 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.147608 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/94af290b-e61d-41bf-b802-a973593865a7-crio-socket\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.147795 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.147778 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/94af290b-e61d-41bf-b802-a973593865a7-data-volume\") pod \"insights-runtime-extractor-hvxg8\" (UID: 
\"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.147855 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.147838 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/94af290b-e61d-41bf-b802-a973593865a7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.149751 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.149731 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/94af290b-e61d-41bf-b802-a973593865a7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.166893 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.166869 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhnpr\" (UniqueName: \"kubernetes.io/projected/828ffceb-04a7-4751-b3fd-abe2a2db01c5-kube-api-access-rhnpr\") pod \"downloads-6bcc868b7-ntdxj\" (UID: \"828ffceb-04a7-4751-b3fd-abe2a2db01c5\") " pod="openshift-console/downloads-6bcc868b7-ntdxj" Apr 28 19:16:53.168413 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.168391 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb7sf\" (UniqueName: \"kubernetes.io/projected/94af290b-e61d-41bf-b802-a973593865a7-kube-api-access-zb7sf\") pod \"insights-runtime-extractor-hvxg8\" (UID: \"94af290b-e61d-41bf-b802-a973593865a7\") " pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.198419 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.198385 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ntdxj" Apr 28 19:16:53.249056 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.248838 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3f4c4045-dbe3-4ff4-866e-b265cc183438-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9b4wj\" (UID: \"3f4c4045-dbe3-4ff4-866e-b265cc183438\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj" Apr 28 19:16:53.253156 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.253096 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3f4c4045-dbe3-4ff4-866e-b265cc183438-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9b4wj\" (UID: \"3f4c4045-dbe3-4ff4-866e-b265cc183438\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj" Apr 28 19:16:53.291949 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.291913 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hvxg8" Apr 28 19:16:53.327158 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.327119 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj"
Apr 28 19:16:53.343303 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.343231 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ntdxj"]
Apr 28 19:16:53.347198 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:53.347160 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod828ffceb_04a7_4751_b3fd_abe2a2db01c5.slice/crio-e29505b9c7a6b8854e8c618296b3741cfd781314b477e7221999248f0dc2c35e WatchSource:0}: Error finding container e29505b9c7a6b8854e8c618296b3741cfd781314b477e7221999248f0dc2c35e: Status 404 returned error can't find the container with id e29505b9c7a6b8854e8c618296b3741cfd781314b477e7221999248f0dc2c35e
Apr 28 19:16:53.441205 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.441174 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hvxg8"]
Apr 28 19:16:53.444123 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:53.444089 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94af290b_e61d_41bf_b802_a973593865a7.slice/crio-49e735a8faf653176b3c5148f48279d822817e156dc7f6aedf868b899bfd7589 WatchSource:0}: Error finding container 49e735a8faf653176b3c5148f48279d822817e156dc7f6aedf868b899bfd7589: Status 404 returned error can't find the container with id 49e735a8faf653176b3c5148f48279d822817e156dc7f6aedf868b899bfd7589
Apr 28 19:16:53.512582 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.512462 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj"]
Apr 28 19:16:53.928326 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.928287 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hvxg8" event={"ID":"94af290b-e61d-41bf-b802-a973593865a7","Type":"ContainerStarted","Data":"9882151f3dc6e5c2ab601d20343b706af871f779ac5bb9b6135d197a7482929c"}
Apr 28 19:16:53.928326 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.928332 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hvxg8" event={"ID":"94af290b-e61d-41bf-b802-a973593865a7","Type":"ContainerStarted","Data":"49e735a8faf653176b3c5148f48279d822817e156dc7f6aedf868b899bfd7589"}
Apr 28 19:16:53.929577 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.929550 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ntdxj" event={"ID":"828ffceb-04a7-4751-b3fd-abe2a2db01c5","Type":"ContainerStarted","Data":"e29505b9c7a6b8854e8c618296b3741cfd781314b477e7221999248f0dc2c35e"}
Apr 28 19:16:53.932228 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:53.931509 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj" event={"ID":"3f4c4045-dbe3-4ff4-866e-b265cc183438","Type":"ContainerStarted","Data":"81c0413e55bcb690286d3836f116c58187431f6542400c5456b5054c7419b951"}
Apr 28 19:16:54.937425 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:54.936945 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj" event={"ID":"3f4c4045-dbe3-4ff4-866e-b265cc183438","Type":"ContainerStarted","Data":"4aa6bc9bc1393624c9fd8d66c2439dc66e95239c38b804d986cf52ec80f60e24"}
Apr 28 19:16:54.937425 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:54.937349 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj"
Apr 28 19:16:54.939481 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:54.939454 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hvxg8" event={"ID":"94af290b-e61d-41bf-b802-a973593865a7","Type":"ContainerStarted","Data":"c78780cd3072fe3130c7c386ff514ddefaa8bc5c46d8abd4cb14a731c2ef361c"}
Apr 28 19:16:54.942996 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:54.942947 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj"
Apr 28 19:16:54.967914 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:54.967863 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9b4wj" podStartSLOduration=1.6986895610000001 podStartE2EDuration="2.967849661s" podCreationTimestamp="2026-04-28 19:16:52 +0000 UTC" firstStartedPulling="2026-04-28 19:16:53.513021974 +0000 UTC m=+70.446691766" lastFinishedPulling="2026-04-28 19:16:54.782182086 +0000 UTC m=+71.715851866" observedRunningTime="2026-04-28 19:16:54.9675018 +0000 UTC m=+71.901171597" watchObservedRunningTime="2026-04-28 19:16:54.967849661 +0000 UTC m=+71.901519457"
Apr 28 19:16:55.360654 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.360576 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-97v2f"]
Apr 28 19:16:55.365132 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.365109 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.368230 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.368208 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 28 19:16:55.369110 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.369092 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-fsphw\""
Apr 28 19:16:55.369231 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.369138 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 28 19:16:55.369331 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.369290 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 28 19:16:55.379918 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.379897 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-97v2f"]
Apr 28 19:16:55.469134 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.469098 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f26ab95f-87b7-46a9-a5db-7a5590ec350d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.469275 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.469158 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f26ab95f-87b7-46a9-a5db-7a5590ec350d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.469275 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.469187 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f26ab95f-87b7-46a9-a5db-7a5590ec350d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.469275 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.469237 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8szk\" (UniqueName: \"kubernetes.io/projected/f26ab95f-87b7-46a9-a5db-7a5590ec350d-kube-api-access-b8szk\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.570135 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.570086 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f26ab95f-87b7-46a9-a5db-7a5590ec350d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.570308 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.570175 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8szk\" (UniqueName: \"kubernetes.io/projected/f26ab95f-87b7-46a9-a5db-7a5590ec350d-kube-api-access-b8szk\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.570308 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.570237 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f26ab95f-87b7-46a9-a5db-7a5590ec350d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.570308 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.570286 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f26ab95f-87b7-46a9-a5db-7a5590ec350d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.570480 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:55.570396 2565 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 28 19:16:55.570480 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:16:55.570458 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f26ab95f-87b7-46a9-a5db-7a5590ec350d-prometheus-operator-tls podName:f26ab95f-87b7-46a9-a5db-7a5590ec350d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:56.070437526 +0000 UTC m=+73.004107308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f26ab95f-87b7-46a9-a5db-7a5590ec350d-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-97v2f" (UID: "f26ab95f-87b7-46a9-a5db-7a5590ec350d") : secret "prometheus-operator-tls" not found
Apr 28 19:16:55.571053 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.571024 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f26ab95f-87b7-46a9-a5db-7a5590ec350d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.572688 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.572666 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f26ab95f-87b7-46a9-a5db-7a5590ec350d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.587966 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.587934 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8szk\" (UniqueName: \"kubernetes.io/projected/f26ab95f-87b7-46a9-a5db-7a5590ec350d-kube-api-access-b8szk\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:55.944806 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.944716 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hvxg8" event={"ID":"94af290b-e61d-41bf-b802-a973593865a7","Type":"ContainerStarted","Data":"18dc1dd4ea0a830325665e21f8374c97b87ca861cb3058b17aeca1be7cae3243"}
Apr 28 19:16:55.975747 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:55.975687 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hvxg8" podStartSLOduration=1.860521781 podStartE2EDuration="3.975671349s" podCreationTimestamp="2026-04-28 19:16:52 +0000 UTC" firstStartedPulling="2026-04-28 19:16:53.512169242 +0000 UTC m=+70.445839022" lastFinishedPulling="2026-04-28 19:16:55.627318815 +0000 UTC m=+72.560988590" observedRunningTime="2026-04-28 19:16:55.974180846 +0000 UTC m=+72.907850645" watchObservedRunningTime="2026-04-28 19:16:55.975671349 +0000 UTC m=+72.909341144"
Apr 28 19:16:56.074553 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.074507 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f26ab95f-87b7-46a9-a5db-7a5590ec350d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:56.077313 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.077277 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f26ab95f-87b7-46a9-a5db-7a5590ec350d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-97v2f\" (UID: \"f26ab95f-87b7-46a9-a5db-7a5590ec350d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:56.274309 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.274279 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cd6fd49dd-6l6vk"]
Apr 28 19:16:56.278163 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.278138 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f"
Apr 28 19:16:56.313377 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.313342 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd6fd49dd-6l6vk"]
Apr 28 19:16:56.313536 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.313491 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.317312 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.317236 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 28 19:16:56.317719 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.317700 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 28 19:16:56.317819 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.317700 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 28 19:16:56.317819 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.317758 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ffxhh\""
Apr 28 19:16:56.318342 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.318320 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 28 19:16:56.318446 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.318421 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 28 19:16:56.415765 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.415737 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-97v2f"]
Apr 28 19:16:56.418013 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:56.417966 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf26ab95f_87b7_46a9_a5db_7a5590ec350d.slice/crio-64eb97c39c5ded8952a3a5597222fa531a15e2a0206af6b61fbb98aa92ce6aa2 WatchSource:0}: Error finding container 64eb97c39c5ded8952a3a5597222fa531a15e2a0206af6b61fbb98aa92ce6aa2: Status 404 returned error can't find the container with id 64eb97c39c5ded8952a3a5597222fa531a15e2a0206af6b61fbb98aa92ce6aa2
Apr 28 19:16:56.477487 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.477452 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-oauth-config\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.477673 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.477510 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-serving-cert\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.477673 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.477541 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-oauth-serving-cert\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.477673 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.477575 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-console-config\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.477673 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.477614 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bxkj\" (UniqueName: \"kubernetes.io/projected/932db50e-415a-4998-8daf-414ec4148fdb-kube-api-access-9bxkj\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.477673 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.477631 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-service-ca\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.578175 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.578083 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-serving-cert\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.578175 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.578133 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-oauth-serving-cert\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.578389 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.578326 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-console-config\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.578389 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.578379 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bxkj\" (UniqueName: \"kubernetes.io/projected/932db50e-415a-4998-8daf-414ec4148fdb-kube-api-access-9bxkj\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.578491 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.578408 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-service-ca\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.578491 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.578462 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-oauth-config\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.578855 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.578829 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-oauth-serving-cert\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.578962 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.578948 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-console-config\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.579423 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.579399 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-service-ca\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.581104 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.581076 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-serving-cert\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.581312 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.581288 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-oauth-config\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.587052 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.587022 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bxkj\" (UniqueName: \"kubernetes.io/projected/932db50e-415a-4998-8daf-414ec4148fdb-kube-api-access-9bxkj\") pod \"console-7cd6fd49dd-6l6vk\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.624046 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.624011 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd6fd49dd-6l6vk"
Apr 28 19:16:56.761673 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.761629 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd6fd49dd-6l6vk"]
Apr 28 19:16:56.766529 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:16:56.766497 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod932db50e_415a_4998_8daf_414ec4148fdb.slice/crio-27c6750d4d24c078cf9c3110d8fa3d6b4928ea961c3cbce743513cdeee8dc03e WatchSource:0}: Error finding container 27c6750d4d24c078cf9c3110d8fa3d6b4928ea961c3cbce743513cdeee8dc03e: Status 404 returned error can't find the container with id 27c6750d4d24c078cf9c3110d8fa3d6b4928ea961c3cbce743513cdeee8dc03e
Apr 28 19:16:56.950301 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.950200 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd6fd49dd-6l6vk" event={"ID":"932db50e-415a-4998-8daf-414ec4148fdb","Type":"ContainerStarted","Data":"27c6750d4d24c078cf9c3110d8fa3d6b4928ea961c3cbce743513cdeee8dc03e"}
Apr 28 19:16:56.951836 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:56.951802 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f" event={"ID":"f26ab95f-87b7-46a9-a5db-7a5590ec350d","Type":"ContainerStarted","Data":"64eb97c39c5ded8952a3a5597222fa531a15e2a0206af6b61fbb98aa92ce6aa2"}
Apr 28 19:16:58.790207 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:58.790178 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zj9qs"
Apr 28 19:16:58.960241 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:58.960207 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f" event={"ID":"f26ab95f-87b7-46a9-a5db-7a5590ec350d","Type":"ContainerStarted","Data":"93b6534f3472fb5c04a1209a93cbbf0db56fe22cd6d537a61c3a7f8f680d7a93"}
Apr 28 19:16:58.960241 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:16:58.960246 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f" event={"ID":"f26ab95f-87b7-46a9-a5db-7a5590ec350d","Type":"ContainerStarted","Data":"afea8388de6b1f6b7a30865cecad192b137f9d5b9cdd0757543fff4e2a2b2cfc"}
Apr 28 19:17:00.771200 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.770303 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-97v2f" podStartSLOduration=4.003176857 podStartE2EDuration="5.770283135s" podCreationTimestamp="2026-04-28 19:16:55 +0000 UTC" firstStartedPulling="2026-04-28 19:16:56.420323179 +0000 UTC m=+73.353992960" lastFinishedPulling="2026-04-28 19:16:58.18742946 +0000 UTC m=+75.121099238" observedRunningTime="2026-04-28 19:16:59.015281046 +0000 UTC m=+75.948950872" watchObservedRunningTime="2026-04-28 19:17:00.770283135 +0000 UTC m=+77.703952932"
Apr 28 19:17:00.772900 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.772801 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jq5hc"]
Apr 28 19:17:00.792384 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.790831 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:00.793882 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.793561 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 28 19:17:00.793882 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.793785 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 28 19:17:00.794668 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.794255 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x74v7\""
Apr 28 19:17:00.794668 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.794489 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 28 19:17:00.920300 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.920255 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:00.920481 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.920310 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93caaf59-62bc-4dc7-bb6d-81120b7144cb-sys\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:00.920481 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.920339 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xmp6\" (UniqueName: \"kubernetes.io/projected/93caaf59-62bc-4dc7-bb6d-81120b7144cb-kube-api-access-2xmp6\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:00.920481 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.920372 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93caaf59-62bc-4dc7-bb6d-81120b7144cb-metrics-client-ca\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:00.920481 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.920406 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-accelerators-collector-config\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:00.920481 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.920458 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93caaf59-62bc-4dc7-bb6d-81120b7144cb-root\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:00.920740 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.920492 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-wtmp\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:00.920740 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.920522 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-textfile\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:00.920740 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.920544 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-tls\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:00.981648 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:00.981512 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd6fd49dd-6l6vk" event={"ID":"932db50e-415a-4998-8daf-414ec4148fdb","Type":"ContainerStarted","Data":"471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2"}
Apr 28 19:17:01.021658 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.021579 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.021947 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.021906 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93caaf59-62bc-4dc7-bb6d-81120b7144cb-sys\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.022117 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.022077 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93caaf59-62bc-4dc7-bb6d-81120b7144cb-sys\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.022634 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.022577 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xmp6\" (UniqueName: \"kubernetes.io/projected/93caaf59-62bc-4dc7-bb6d-81120b7144cb-kube-api-access-2xmp6\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.022832 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.022790 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93caaf59-62bc-4dc7-bb6d-81120b7144cb-metrics-client-ca\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.023077 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.023027 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-accelerators-collector-config\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.023326 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.023227 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93caaf59-62bc-4dc7-bb6d-81120b7144cb-root\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.023326 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.023287 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-wtmp\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.023537 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.023482 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-textfile\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.025109 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.023512 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-tls\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.025389 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.024057 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93caaf59-62bc-4dc7-bb6d-81120b7144cb-root\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.025389 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.024443 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.025389 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.024557 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93caaf59-62bc-4dc7-bb6d-81120b7144cb-metrics-client-ca\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.025639 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.024562 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-wtmp\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.025639 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.024997 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-accelerators-collector-config\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.028451 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.028408 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-textfile\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc"
Apr 28 19:17:01.030421 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.030367 2565 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93caaf59-62bc-4dc7-bb6d-81120b7144cb-node-exporter-tls\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc" Apr 28 19:17:01.039492 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.039419 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xmp6\" (UniqueName: \"kubernetes.io/projected/93caaf59-62bc-4dc7-bb6d-81120b7144cb-kube-api-access-2xmp6\") pod \"node-exporter-jq5hc\" (UID: \"93caaf59-62bc-4dc7-bb6d-81120b7144cb\") " pod="openshift-monitoring/node-exporter-jq5hc" Apr 28 19:17:01.130053 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.130018 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jq5hc" Apr 28 19:17:01.985477 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:01.985445 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jq5hc" event={"ID":"93caaf59-62bc-4dc7-bb6d-81120b7144cb","Type":"ContainerStarted","Data":"4906c797297f1b1998fda73759f7febce357ce4d030b6918b4356d7760b34619"} Apr 28 19:17:02.934912 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:02.934831 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xzw6d" Apr 28 19:17:02.956889 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:02.956834 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cd6fd49dd-6l6vk" podStartSLOduration=2.979843252 podStartE2EDuration="6.956814043s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:16:56.768902193 +0000 UTC m=+73.702571967" lastFinishedPulling="2026-04-28 19:17:00.745872976 +0000 UTC m=+77.679542758" observedRunningTime="2026-04-28 19:17:01.009892238 +0000 UTC m=+77.943562041" 
watchObservedRunningTime="2026-04-28 19:17:02.956814043 +0000 UTC m=+79.890483840" Apr 28 19:17:03.014408 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:03.014365 2565 generic.go:358] "Generic (PLEG): container finished" podID="93caaf59-62bc-4dc7-bb6d-81120b7144cb" containerID="0795338f20a1175272e1487967ca7ce8c927462e1e4ed04c2987b741fa522db8" exitCode=0 Apr 28 19:17:03.015943 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:03.015816 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jq5hc" event={"ID":"93caaf59-62bc-4dc7-bb6d-81120b7144cb","Type":"ContainerDied","Data":"0795338f20a1175272e1487967ca7ce8c927462e1e4ed04c2987b741fa522db8"} Apr 28 19:17:04.022055 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:04.022015 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jq5hc" event={"ID":"93caaf59-62bc-4dc7-bb6d-81120b7144cb","Type":"ContainerStarted","Data":"5398dc551d67eef0692168a7beba9c66ea3d3eadd053571c5371e43680ba4151"} Apr 28 19:17:04.022055 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:04.022055 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jq5hc" event={"ID":"93caaf59-62bc-4dc7-bb6d-81120b7144cb","Type":"ContainerStarted","Data":"26c82310a59fc6a924f994865ec5954a4fbb40d5b0965f65d26abfd610a2f87b"} Apr 28 19:17:04.043932 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:04.043881 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jq5hc" podStartSLOduration=2.835490862 podStartE2EDuration="4.043862634s" podCreationTimestamp="2026-04-28 19:17:00 +0000 UTC" firstStartedPulling="2026-04-28 19:17:01.147049739 +0000 UTC m=+78.080719519" lastFinishedPulling="2026-04-28 19:17:02.355421507 +0000 UTC m=+79.289091291" observedRunningTime="2026-04-28 19:17:04.042396994 +0000 UTC m=+80.976066828" watchObservedRunningTime="2026-04-28 19:17:04.043862634 +0000 UTC 
m=+80.977532429" Apr 28 19:17:06.624333 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:06.624287 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cd6fd49dd-6l6vk" Apr 28 19:17:06.624775 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:06.624348 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cd6fd49dd-6l6vk" Apr 28 19:17:06.630019 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:06.629971 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cd6fd49dd-6l6vk" Apr 28 19:17:07.039019 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:07.038949 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cd6fd49dd-6l6vk" Apr 28 19:17:07.575322 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:07.575275 2565 patch_prober.go:28] interesting pod/image-registry-6598fb7d78-gsbrc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 28 19:17:07.575505 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:07.575350 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" podUID="de9861e6-21f7-4675-9370-4fa19956dd76" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:17:10.887800 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:10.887767 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:17:13.054452 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:13.054400 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-6bcc868b7-ntdxj" event={"ID":"828ffceb-04a7-4751-b3fd-abe2a2db01c5","Type":"ContainerStarted","Data":"63ed07aa93e0613788c4d3f3e01d47894368127b8b13192b03bb4754be21c1a3"} Apr 28 19:17:13.054932 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:13.054758 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-ntdxj" Apr 28 19:17:13.056506 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:13.056472 2565 patch_prober.go:28] interesting pod/downloads-6bcc868b7-ntdxj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.132.0.21:8080/\": dial tcp 10.132.0.21:8080: connect: connection refused" start-of-body= Apr 28 19:17:13.056639 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:13.056544 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-ntdxj" podUID="828ffceb-04a7-4751-b3fd-abe2a2db01c5" containerName="download-server" probeResult="failure" output="Get \"http://10.132.0.21:8080/\": dial tcp 10.132.0.21:8080: connect: connection refused" Apr 28 19:17:13.083880 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:13.083813 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-ntdxj" podStartSLOduration=1.565615572 podStartE2EDuration="21.083792295s" podCreationTimestamp="2026-04-28 19:16:52 +0000 UTC" firstStartedPulling="2026-04-28 19:16:53.350129582 +0000 UTC m=+70.283799358" lastFinishedPulling="2026-04-28 19:17:12.868306107 +0000 UTC m=+89.801976081" observedRunningTime="2026-04-28 19:17:13.082479695 +0000 UTC m=+90.016149492" watchObservedRunningTime="2026-04-28 19:17:13.083792295 +0000 UTC m=+90.017462092" Apr 28 19:17:14.062026 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:14.061987 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-ntdxj" Apr 
28 19:17:19.609943 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:19.609909 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6598fb7d78-gsbrc"] Apr 28 19:17:24.386226 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:24.386192 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cd6fd49dd-6l6vk"] Apr 28 19:17:44.632951 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:44.632895 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" podUID="de9861e6-21f7-4675-9370-4fa19956dd76" containerName="registry" containerID="cri-o://ce2742b470d9896fe6f12ab6711e51a7ae7a6cd8a25f1493798f48772ec24ee1" gracePeriod=30 Apr 28 19:17:46.153359 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.153310 2565 generic.go:358] "Generic (PLEG): container finished" podID="de9861e6-21f7-4675-9370-4fa19956dd76" containerID="ce2742b470d9896fe6f12ab6711e51a7ae7a6cd8a25f1493798f48772ec24ee1" exitCode=0 Apr 28 19:17:46.153872 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.153365 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" event={"ID":"de9861e6-21f7-4675-9370-4fa19956dd76","Type":"ContainerDied","Data":"ce2742b470d9896fe6f12ab6711e51a7ae7a6cd8a25f1493798f48772ec24ee1"} Apr 28 19:17:46.403709 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.403656 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:17:46.515822 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.515788 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-registry-certificates\") pod \"de9861e6-21f7-4675-9370-4fa19956dd76\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " Apr 28 19:17:46.515822 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.515824 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-image-registry-private-configuration\") pod \"de9861e6-21f7-4675-9370-4fa19956dd76\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " Apr 28 19:17:46.516051 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.515849 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwbtb\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-kube-api-access-pwbtb\") pod \"de9861e6-21f7-4675-9370-4fa19956dd76\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " Apr 28 19:17:46.516051 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.516017 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-installation-pull-secrets\") pod \"de9861e6-21f7-4675-9370-4fa19956dd76\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " Apr 28 19:17:46.516143 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.516073 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls\") pod \"de9861e6-21f7-4675-9370-4fa19956dd76\" (UID: 
\"de9861e6-21f7-4675-9370-4fa19956dd76\") " Apr 28 19:17:46.516143 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.516102 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-bound-sa-token\") pod \"de9861e6-21f7-4675-9370-4fa19956dd76\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " Apr 28 19:17:46.516143 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.516129 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-trusted-ca\") pod \"de9861e6-21f7-4675-9370-4fa19956dd76\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " Apr 28 19:17:46.516289 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.516178 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de9861e6-21f7-4675-9370-4fa19956dd76-ca-trust-extracted\") pod \"de9861e6-21f7-4675-9370-4fa19956dd76\" (UID: \"de9861e6-21f7-4675-9370-4fa19956dd76\") " Apr 28 19:17:46.516289 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.516210 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "de9861e6-21f7-4675-9370-4fa19956dd76" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:17:46.516452 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.516430 2565 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-registry-certificates\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:46.516630 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.516531 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "de9861e6-21f7-4675-9370-4fa19956dd76" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:17:46.518443 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.518408 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "de9861e6-21f7-4675-9370-4fa19956dd76" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:17:46.518784 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.518757 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-kube-api-access-pwbtb" (OuterVolumeSpecName: "kube-api-access-pwbtb") pod "de9861e6-21f7-4675-9370-4fa19956dd76" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76"). InnerVolumeSpecName "kube-api-access-pwbtb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:17:46.518881 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.518779 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "de9861e6-21f7-4675-9370-4fa19956dd76" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:17:46.518881 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.518843 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "de9861e6-21f7-4675-9370-4fa19956dd76" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:17:46.518881 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.518878 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "de9861e6-21f7-4675-9370-4fa19956dd76" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:17:46.526860 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.526831 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de9861e6-21f7-4675-9370-4fa19956dd76-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "de9861e6-21f7-4675-9370-4fa19956dd76" (UID: "de9861e6-21f7-4675-9370-4fa19956dd76"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:17:46.617488 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.617449 2565 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-image-registry-private-configuration\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:46.617488 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.617480 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pwbtb\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-kube-api-access-pwbtb\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:46.617488 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.617492 2565 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de9861e6-21f7-4675-9370-4fa19956dd76-installation-pull-secrets\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:46.617798 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.617501 2565 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-registry-tls\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:46.617798 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.617511 2565 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de9861e6-21f7-4675-9370-4fa19956dd76-bound-sa-token\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:46.617798 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.617519 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de9861e6-21f7-4675-9370-4fa19956dd76-trusted-ca\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 
19:17:46.617798 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:46.617527 2565 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de9861e6-21f7-4675-9370-4fa19956dd76-ca-trust-extracted\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:47.157654 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:47.157620 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" Apr 28 19:17:47.158126 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:47.157623 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6598fb7d78-gsbrc" event={"ID":"de9861e6-21f7-4675-9370-4fa19956dd76","Type":"ContainerDied","Data":"57a7b399bacbeaa97eef5a704b3a73a50b0e8200dc70f43e22b809c57679f969"} Apr 28 19:17:47.158126 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:47.157744 2565 scope.go:117] "RemoveContainer" containerID="ce2742b470d9896fe6f12ab6711e51a7ae7a6cd8a25f1493798f48772ec24ee1" Apr 28 19:17:47.181745 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:47.181722 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6598fb7d78-gsbrc"] Apr 28 19:17:47.186539 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:47.186517 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6598fb7d78-gsbrc"] Apr 28 19:17:47.504590 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:47.504196 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de9861e6-21f7-4675-9370-4fa19956dd76" path="/var/lib/kubelet/pods/de9861e6-21f7-4675-9370-4fa19956dd76/volumes" Apr 28 19:17:48.162117 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:48.162028 2565 generic.go:358] "Generic (PLEG): container finished" podID="d08c9e66-fe95-42f4-be54-32ce7e41a44e" 
containerID="8ade874ae562eac95454936bf69476cc7a468bf67723fd6c67dbce7faf37dcb3" exitCode=0 Apr 28 19:17:48.162117 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:48.162105 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" event={"ID":"d08c9e66-fe95-42f4-be54-32ce7e41a44e","Type":"ContainerDied","Data":"8ade874ae562eac95454936bf69476cc7a468bf67723fd6c67dbce7faf37dcb3"} Apr 28 19:17:48.162608 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:48.162473 2565 scope.go:117] "RemoveContainer" containerID="8ade874ae562eac95454936bf69476cc7a468bf67723fd6c67dbce7faf37dcb3" Apr 28 19:17:49.167854 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.167820 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r5k9t" event={"ID":"d08c9e66-fe95-42f4-be54-32ce7e41a44e","Type":"ContainerStarted","Data":"a2ab4ecd8d0dfcd6b8db89fb80db74de8d2ee27d625f23bb075ce34e756121f5"} Apr 28 19:17:49.408495 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.408450 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7cd6fd49dd-6l6vk" podUID="932db50e-415a-4998-8daf-414ec4148fdb" containerName="console" containerID="cri-o://471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2" gracePeriod=15 Apr 28 19:17:49.668709 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.668652 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cd6fd49dd-6l6vk_932db50e-415a-4998-8daf-414ec4148fdb/console/0.log" Apr 28 19:17:49.668819 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.668709 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cd6fd49dd-6l6vk" Apr 28 19:17:49.741790 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.741754 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-oauth-config\") pod \"932db50e-415a-4998-8daf-414ec4148fdb\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " Apr 28 19:17:49.741967 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.741810 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-service-ca\") pod \"932db50e-415a-4998-8daf-414ec4148fdb\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " Apr 28 19:17:49.741967 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.741844 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-serving-cert\") pod \"932db50e-415a-4998-8daf-414ec4148fdb\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " Apr 28 19:17:49.741967 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.741866 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-oauth-serving-cert\") pod \"932db50e-415a-4998-8daf-414ec4148fdb\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " Apr 28 19:17:49.741967 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.741891 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-console-config\") pod \"932db50e-415a-4998-8daf-414ec4148fdb\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " Apr 28 19:17:49.741967 ip-10-0-133-121 
kubenswrapper[2565]: I0428 19:17:49.741944 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bxkj\" (UniqueName: \"kubernetes.io/projected/932db50e-415a-4998-8daf-414ec4148fdb-kube-api-access-9bxkj\") pod \"932db50e-415a-4998-8daf-414ec4148fdb\" (UID: \"932db50e-415a-4998-8daf-414ec4148fdb\") " Apr 28 19:17:49.742320 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.742296 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-console-config" (OuterVolumeSpecName: "console-config") pod "932db50e-415a-4998-8daf-414ec4148fdb" (UID: "932db50e-415a-4998-8daf-414ec4148fdb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:17:49.742385 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.742308 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-service-ca" (OuterVolumeSpecName: "service-ca") pod "932db50e-415a-4998-8daf-414ec4148fdb" (UID: "932db50e-415a-4998-8daf-414ec4148fdb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:17:49.742385 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.742293 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "932db50e-415a-4998-8daf-414ec4148fdb" (UID: "932db50e-415a-4998-8daf-414ec4148fdb"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:17:49.744105 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.744082 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "932db50e-415a-4998-8daf-414ec4148fdb" (UID: "932db50e-415a-4998-8daf-414ec4148fdb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:17:49.744105 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.744098 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "932db50e-415a-4998-8daf-414ec4148fdb" (UID: "932db50e-415a-4998-8daf-414ec4148fdb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:17:49.744245 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.744131 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/932db50e-415a-4998-8daf-414ec4148fdb-kube-api-access-9bxkj" (OuterVolumeSpecName: "kube-api-access-9bxkj") pod "932db50e-415a-4998-8daf-414ec4148fdb" (UID: "932db50e-415a-4998-8daf-414ec4148fdb"). InnerVolumeSpecName "kube-api-access-9bxkj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:17:49.843462 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.843432 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-service-ca\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:49.843462 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.843459 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-serving-cert\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:49.843604 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.843471 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-oauth-serving-cert\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:49.843604 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.843480 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/932db50e-415a-4998-8daf-414ec4148fdb-console-config\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:49.843604 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.843490 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9bxkj\" (UniqueName: \"kubernetes.io/projected/932db50e-415a-4998-8daf-414ec4148fdb-kube-api-access-9bxkj\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:49.843604 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:49.843499 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/932db50e-415a-4998-8daf-414ec4148fdb-console-oauth-config\") on node \"ip-10-0-133-121.ec2.internal\" DevicePath \"\"" Apr 28 19:17:50.171870 ip-10-0-133-121 
kubenswrapper[2565]: I0428 19:17:50.171839 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cd6fd49dd-6l6vk_932db50e-415a-4998-8daf-414ec4148fdb/console/0.log" Apr 28 19:17:50.172293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:50.171882 2565 generic.go:358] "Generic (PLEG): container finished" podID="932db50e-415a-4998-8daf-414ec4148fdb" containerID="471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2" exitCode=2 Apr 28 19:17:50.172293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:50.171919 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd6fd49dd-6l6vk" event={"ID":"932db50e-415a-4998-8daf-414ec4148fdb","Type":"ContainerDied","Data":"471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2"} Apr 28 19:17:50.172293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:50.171969 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd6fd49dd-6l6vk" event={"ID":"932db50e-415a-4998-8daf-414ec4148fdb","Type":"ContainerDied","Data":"27c6750d4d24c078cf9c3110d8fa3d6b4928ea961c3cbce743513cdeee8dc03e"} Apr 28 19:17:50.172293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:50.171996 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cd6fd49dd-6l6vk" Apr 28 19:17:50.172293 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:50.172000 2565 scope.go:117] "RemoveContainer" containerID="471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2" Apr 28 19:17:50.179943 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:50.179917 2565 scope.go:117] "RemoveContainer" containerID="471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2" Apr 28 19:17:50.180215 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:17:50.180193 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2\": container with ID starting with 471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2 not found: ID does not exist" containerID="471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2" Apr 28 19:17:50.180270 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:50.180223 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2"} err="failed to get container status \"471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2\": rpc error: code = NotFound desc = could not find container \"471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2\": container with ID starting with 471243f9df6e055cab2fe462016263100e927a8efa9448992ccf4e56e8c931b2 not found: ID does not exist" Apr 28 19:17:50.194451 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:50.194428 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cd6fd49dd-6l6vk"] Apr 28 19:17:50.199023 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:50.199004 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7cd6fd49dd-6l6vk"] Apr 28 19:17:51.502279 ip-10-0-133-121 kubenswrapper[2565]: 
I0428 19:17:51.502244 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="932db50e-415a-4998-8daf-414ec4148fdb" path="/var/lib/kubelet/pods/932db50e-415a-4998-8daf-414ec4148fdb/volumes" Apr 28 19:17:54.184694 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:54.184616 2565 generic.go:358] "Generic (PLEG): container finished" podID="bd439354-dd75-46f4-9b42-13a91c27d851" containerID="278c8fc5ead022d7c67e3d1af4525d60bcf9eab3817485be66a8ddc284d100e7" exitCode=0 Apr 28 19:17:54.184694 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:54.184663 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-fl75w" event={"ID":"bd439354-dd75-46f4-9b42-13a91c27d851","Type":"ContainerDied","Data":"278c8fc5ead022d7c67e3d1af4525d60bcf9eab3817485be66a8ddc284d100e7"} Apr 28 19:17:54.185108 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:54.185022 2565 scope.go:117] "RemoveContainer" containerID="278c8fc5ead022d7c67e3d1af4525d60bcf9eab3817485be66a8ddc284d100e7" Apr 28 19:17:55.189339 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:55.189304 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-fl75w" event={"ID":"bd439354-dd75-46f4-9b42-13a91c27d851","Type":"ContainerStarted","Data":"dbd2c697eab76956a8e00cd425a887b464ca9a6f2acf6aace42cf892bab7ba51"} Apr 28 19:17:58.199341 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:58.199314 2565 generic.go:358] "Generic (PLEG): container finished" podID="f2b08f0e-bfe6-4201-9b2f-80bb5473ba65" containerID="a53e67160bffcd379ae97463e11c55c014575aa32aade6d73f521ce8b0a23d0f" exitCode=0 Apr 28 19:17:58.199725 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:58.199395 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf" 
event={"ID":"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65","Type":"ContainerDied","Data":"a53e67160bffcd379ae97463e11c55c014575aa32aade6d73f521ce8b0a23d0f"} Apr 28 19:17:58.199793 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:58.199776 2565 scope.go:117] "RemoveContainer" containerID="a53e67160bffcd379ae97463e11c55c014575aa32aade6d73f521ce8b0a23d0f" Apr 28 19:17:59.204536 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:17:59.204500 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v5vpf" event={"ID":"f2b08f0e-bfe6-4201-9b2f-80bb5473ba65","Type":"ContainerStarted","Data":"437086c9cc50e8fad709cdb7ae7d38c3fc6a6d87ddb699718b8fa94d3a3e1167"} Apr 28 19:19:20.779594 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.779562 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp"] Apr 28 19:19:20.780056 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.779857 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="932db50e-415a-4998-8daf-414ec4148fdb" containerName="console" Apr 28 19:19:20.780056 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.779868 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="932db50e-415a-4998-8daf-414ec4148fdb" containerName="console" Apr 28 19:19:20.780056 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.779881 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de9861e6-21f7-4675-9370-4fa19956dd76" containerName="registry" Apr 28 19:19:20.780056 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.779886 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="de9861e6-21f7-4675-9370-4fa19956dd76" containerName="registry" Apr 28 19:19:20.780056 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.779931 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="932db50e-415a-4998-8daf-414ec4148fdb" containerName="console" 
Apr 28 19:19:20.780056 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.779939 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="de9861e6-21f7-4675-9370-4fa19956dd76" containerName="registry" Apr 28 19:19:20.782779 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.782762 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" Apr 28 19:19:20.788871 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.788847 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 28 19:19:20.789484 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.789196 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 28 19:19:20.789484 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.789350 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 28 19:19:20.795539 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.795523 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-mf299\"" Apr 28 19:19:20.808737 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.808712 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp"] Apr 28 19:19:20.935633 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.935601 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0852b8fb-4797-4a3f-a4c0-a3388e93d65b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jkttp\" (UID: \"0852b8fb-4797-4a3f-a4c0-a3388e93d65b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" Apr 28 19:19:20.935791 
ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:20.935697 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjgl5\" (UniqueName: \"kubernetes.io/projected/0852b8fb-4797-4a3f-a4c0-a3388e93d65b-kube-api-access-rjgl5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jkttp\" (UID: \"0852b8fb-4797-4a3f-a4c0-a3388e93d65b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" Apr 28 19:19:21.036637 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:21.036555 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjgl5\" (UniqueName: \"kubernetes.io/projected/0852b8fb-4797-4a3f-a4c0-a3388e93d65b-kube-api-access-rjgl5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jkttp\" (UID: \"0852b8fb-4797-4a3f-a4c0-a3388e93d65b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" Apr 28 19:19:21.036637 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:21.036608 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0852b8fb-4797-4a3f-a4c0-a3388e93d65b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jkttp\" (UID: \"0852b8fb-4797-4a3f-a4c0-a3388e93d65b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" Apr 28 19:19:21.038954 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:21.038930 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0852b8fb-4797-4a3f-a4c0-a3388e93d65b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jkttp\" (UID: \"0852b8fb-4797-4a3f-a4c0-a3388e93d65b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" Apr 28 19:19:21.046215 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:21.046189 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rjgl5\" (UniqueName: \"kubernetes.io/projected/0852b8fb-4797-4a3f-a4c0-a3388e93d65b-kube-api-access-rjgl5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jkttp\" (UID: \"0852b8fb-4797-4a3f-a4c0-a3388e93d65b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" Apr 28 19:19:21.092734 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:21.092709 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" Apr 28 19:19:21.237309 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:21.237239 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp"] Apr 28 19:19:21.239733 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:19:21.239706 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0852b8fb_4797_4a3f_a4c0_a3388e93d65b.slice/crio-9379d5cd54803f24bc39b20a18cf2b19cdceb7b539f908272a599ef54b1be20f WatchSource:0}: Error finding container 9379d5cd54803f24bc39b20a18cf2b19cdceb7b539f908272a599ef54b1be20f: Status 404 returned error can't find the container with id 9379d5cd54803f24bc39b20a18cf2b19cdceb7b539f908272a599ef54b1be20f Apr 28 19:19:21.450955 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:21.450870 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" event={"ID":"0852b8fb-4797-4a3f-a4c0-a3388e93d65b","Type":"ContainerStarted","Data":"9379d5cd54803f24bc39b20a18cf2b19cdceb7b539f908272a599ef54b1be20f"} Apr 28 19:19:25.471814 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:25.471729 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" 
event={"ID":"0852b8fb-4797-4a3f-a4c0-a3388e93d65b","Type":"ContainerStarted","Data":"de9eb73e0e96a1d5473a8f45a6b4362f80dc60b70d9eed052573c067dd5d9f13"} Apr 28 19:19:25.472372 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:25.471893 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" Apr 28 19:19:25.501195 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:25.501143 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" podStartSLOduration=1.589800825 podStartE2EDuration="5.501129752s" podCreationTimestamp="2026-04-28 19:19:20 +0000 UTC" firstStartedPulling="2026-04-28 19:19:21.241436953 +0000 UTC m=+218.175106726" lastFinishedPulling="2026-04-28 19:19:25.152765876 +0000 UTC m=+222.086435653" observedRunningTime="2026-04-28 19:19:25.50055437 +0000 UTC m=+222.434224165" watchObservedRunningTime="2026-04-28 19:19:25.501129752 +0000 UTC m=+222.434799546" Apr 28 19:19:26.209731 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.209693 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4"] Apr 28 19:19:26.212941 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.212924 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:26.215859 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.215835 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rcjxh\"" Apr 28 19:19:26.216216 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.216200 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 28 19:19:26.216646 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.216630 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 28 19:19:26.229665 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.229641 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4"] Apr 28 19:19:26.384089 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.384054 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d16abc0b-45ac-4eee-ad7b-660f59c09575-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:26.384248 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.384100 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:26.384248 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.384162 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c8w5r\" (UniqueName: \"kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-kube-api-access-c8w5r\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:26.485142 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.485062 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:26.485142 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.485116 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8w5r\" (UniqueName: \"kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-kube-api-access-c8w5r\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:26.485576 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:26.485223 2565 secret.go:281] references non-existent secret key: tls.crt Apr 28 19:19:26.485576 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:26.485246 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 28 19:19:26.485576 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:26.485270 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4: references non-existent secret key: tls.crt Apr 28 19:19:26.485576 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.485284 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: 
\"kubernetes.io/empty-dir/d16abc0b-45ac-4eee-ad7b-660f59c09575-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:26.485576 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:26.485331 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates podName:d16abc0b-45ac-4eee-ad7b-660f59c09575 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:26.9853102 +0000 UTC m=+223.918979973 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates") pod "keda-metrics-apiserver-7c9f485588-94dx4" (UID: "d16abc0b-45ac-4eee-ad7b-660f59c09575") : references non-existent secret key: tls.crt Apr 28 19:19:26.485731 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.485653 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d16abc0b-45ac-4eee-ad7b-660f59c09575-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:26.498127 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.498105 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8w5r\" (UniqueName: \"kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-kube-api-access-c8w5r\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:26.990340 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:26.990296 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:26.990502 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:26.990464 2565 secret.go:281] references non-existent secret key: tls.crt Apr 28 19:19:26.990502 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:26.990482 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 28 19:19:26.990502 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:26.990502 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4: references non-existent secret key: tls.crt Apr 28 19:19:26.990605 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:26.990558 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates podName:d16abc0b-45ac-4eee-ad7b-660f59c09575 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:27.990542129 +0000 UTC m=+224.924211902 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates") pod "keda-metrics-apiserver-7c9f485588-94dx4" (UID: "d16abc0b-45ac-4eee-ad7b-660f59c09575") : references non-existent secret key: tls.crt Apr 28 19:19:27.999963 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:27.999918 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:28.000382 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:28.000082 2565 secret.go:281] references non-existent secret key: tls.crt Apr 28 19:19:28.000382 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:28.000104 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 28 19:19:28.000382 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:28.000122 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4: references non-existent secret key: tls.crt Apr 28 19:19:28.000382 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:28.000182 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates podName:d16abc0b-45ac-4eee-ad7b-660f59c09575 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:30.000168416 +0000 UTC m=+226.933838209 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates") pod "keda-metrics-apiserver-7c9f485588-94dx4" (UID: "d16abc0b-45ac-4eee-ad7b-660f59c09575") : references non-existent secret key: tls.crt Apr 28 19:19:30.015879 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:30.015844 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:30.016312 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:30.016022 2565 secret.go:281] references non-existent secret key: tls.crt Apr 28 19:19:30.016312 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:30.016040 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 28 19:19:30.016312 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:30.016057 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4: references non-existent secret key: tls.crt Apr 28 19:19:30.016312 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:19:30.016175 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates podName:d16abc0b-45ac-4eee-ad7b-660f59c09575 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:34.016154253 +0000 UTC m=+230.949824027 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates") pod "keda-metrics-apiserver-7c9f485588-94dx4" (UID: "d16abc0b-45ac-4eee-ad7b-660f59c09575") : references non-existent secret key: tls.crt Apr 28 19:19:34.041478 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:34.041443 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:34.044100 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:34.044076 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d16abc0b-45ac-4eee-ad7b-660f59c09575-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94dx4\" (UID: \"d16abc0b-45ac-4eee-ad7b-660f59c09575\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:34.325430 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:34.325350 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:34.445082 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:34.445052 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4"] Apr 28 19:19:34.448163 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:19:34.448133 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd16abc0b_45ac_4eee_ad7b_660f59c09575.slice/crio-a1b9c51a6106a4dc05f4bbdfd496d06f6d7afbeee72f1f967c4e9cc0dcd7f2a2 WatchSource:0}: Error finding container a1b9c51a6106a4dc05f4bbdfd496d06f6d7afbeee72f1f967c4e9cc0dcd7f2a2: Status 404 returned error can't find the container with id a1b9c51a6106a4dc05f4bbdfd496d06f6d7afbeee72f1f967c4e9cc0dcd7f2a2 Apr 28 19:19:34.502097 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:34.502063 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" event={"ID":"d16abc0b-45ac-4eee-ad7b-660f59c09575","Type":"ContainerStarted","Data":"a1b9c51a6106a4dc05f4bbdfd496d06f6d7afbeee72f1f967c4e9cc0dcd7f2a2"} Apr 28 19:19:38.515386 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:38.515293 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" event={"ID":"d16abc0b-45ac-4eee-ad7b-660f59c09575","Type":"ContainerStarted","Data":"3d0b0b0fb38024a441699e5218f459e586d6cfe6774a1b569d5c28aa2a239989"} Apr 28 19:19:38.515770 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:38.515400 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:19:38.548854 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:38.548807 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" 
podStartSLOduration=8.740077028 podStartE2EDuration="12.548790255s" podCreationTimestamp="2026-04-28 19:19:26 +0000 UTC" firstStartedPulling="2026-04-28 19:19:34.449790106 +0000 UTC m=+231.383459880" lastFinishedPulling="2026-04-28 19:19:38.258503327 +0000 UTC m=+235.192173107" observedRunningTime="2026-04-28 19:19:38.54792229 +0000 UTC m=+235.481592086" watchObservedRunningTime="2026-04-28 19:19:38.548790255 +0000 UTC m=+235.482460051" Apr 28 19:19:46.477735 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:46.477704 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jkttp" Apr 28 19:19:49.522920 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:19:49.522889 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94dx4" Apr 28 19:20:32.773853 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.773817 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s"] Apr 28 19:20:32.776954 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.776937 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" Apr 28 19:20:32.779901 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.779881 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-96kt4\"" Apr 28 19:20:32.781005 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.780966 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 28 19:20:32.781110 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.780973 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 28 19:20:32.781258 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.781228 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 28 19:20:32.791226 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.791193 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s"] Apr 28 19:20:32.801086 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.801058 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-fnd7j"] Apr 28 19:20:32.804299 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.804268 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-fnd7j" Apr 28 19:20:32.806897 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.806879 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 28 19:20:32.807054 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.807033 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-hdspq\"" Apr 28 19:20:32.817630 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.817607 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-fnd7j"] Apr 28 19:20:32.903542 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.903496 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j7px\" (UniqueName: \"kubernetes.io/projected/8cbcf98b-b80d-43a0-b2da-ce216a2508d7-kube-api-access-4j7px\") pod \"llmisvc-controller-manager-68cc5db7c4-rkk8s\" (UID: \"8cbcf98b-b80d-43a0-b2da-ce216a2508d7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" Apr 28 19:20:32.903542 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.903538 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd8wq\" (UniqueName: \"kubernetes.io/projected/2198d378-f18c-4e35-a42e-3e910165822e-kube-api-access-xd8wq\") pod \"seaweedfs-86cc847c5c-fnd7j\" (UID: \"2198d378-f18c-4e35-a42e-3e910165822e\") " pod="kserve/seaweedfs-86cc847c5c-fnd7j" Apr 28 19:20:32.903808 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:32.903632 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2198d378-f18c-4e35-a42e-3e910165822e-data\") pod \"seaweedfs-86cc847c5c-fnd7j\" (UID: \"2198d378-f18c-4e35-a42e-3e910165822e\") " pod="kserve/seaweedfs-86cc847c5c-fnd7j" Apr 28 19:20:32.903808 ip-10-0-133-121 
kubenswrapper[2565]: I0428 19:20:32.903665 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cbcf98b-b80d-43a0-b2da-ce216a2508d7-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-rkk8s\" (UID: \"8cbcf98b-b80d-43a0-b2da-ce216a2508d7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" Apr 28 19:20:33.004800 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.004742 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4j7px\" (UniqueName: \"kubernetes.io/projected/8cbcf98b-b80d-43a0-b2da-ce216a2508d7-kube-api-access-4j7px\") pod \"llmisvc-controller-manager-68cc5db7c4-rkk8s\" (UID: \"8cbcf98b-b80d-43a0-b2da-ce216a2508d7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" Apr 28 19:20:33.004800 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.004798 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xd8wq\" (UniqueName: \"kubernetes.io/projected/2198d378-f18c-4e35-a42e-3e910165822e-kube-api-access-xd8wq\") pod \"seaweedfs-86cc847c5c-fnd7j\" (UID: \"2198d378-f18c-4e35-a42e-3e910165822e\") " pod="kserve/seaweedfs-86cc847c5c-fnd7j" Apr 28 19:20:33.005066 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.004840 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2198d378-f18c-4e35-a42e-3e910165822e-data\") pod \"seaweedfs-86cc847c5c-fnd7j\" (UID: \"2198d378-f18c-4e35-a42e-3e910165822e\") " pod="kserve/seaweedfs-86cc847c5c-fnd7j" Apr 28 19:20:33.005066 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.004858 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cbcf98b-b80d-43a0-b2da-ce216a2508d7-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-rkk8s\" (UID: \"8cbcf98b-b80d-43a0-b2da-ce216a2508d7\") " 
pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" Apr 28 19:20:33.005066 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:20:33.004969 2565 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 28 19:20:33.005189 ip-10-0-133-121 kubenswrapper[2565]: E0428 19:20:33.005068 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cbcf98b-b80d-43a0-b2da-ce216a2508d7-cert podName:8cbcf98b-b80d-43a0-b2da-ce216a2508d7 nodeName:}" failed. No retries permitted until 2026-04-28 19:20:33.505048294 +0000 UTC m=+290.438718068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cbcf98b-b80d-43a0-b2da-ce216a2508d7-cert") pod "llmisvc-controller-manager-68cc5db7c4-rkk8s" (UID: "8cbcf98b-b80d-43a0-b2da-ce216a2508d7") : secret "llmisvc-webhook-server-cert" not found Apr 28 19:20:33.005345 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.005327 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2198d378-f18c-4e35-a42e-3e910165822e-data\") pod \"seaweedfs-86cc847c5c-fnd7j\" (UID: \"2198d378-f18c-4e35-a42e-3e910165822e\") " pod="kserve/seaweedfs-86cc847c5c-fnd7j" Apr 28 19:20:33.018752 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.018720 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j7px\" (UniqueName: \"kubernetes.io/projected/8cbcf98b-b80d-43a0-b2da-ce216a2508d7-kube-api-access-4j7px\") pod \"llmisvc-controller-manager-68cc5db7c4-rkk8s\" (UID: \"8cbcf98b-b80d-43a0-b2da-ce216a2508d7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" Apr 28 19:20:33.019350 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.019334 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd8wq\" (UniqueName: 
\"kubernetes.io/projected/2198d378-f18c-4e35-a42e-3e910165822e-kube-api-access-xd8wq\") pod \"seaweedfs-86cc847c5c-fnd7j\" (UID: \"2198d378-f18c-4e35-a42e-3e910165822e\") " pod="kserve/seaweedfs-86cc847c5c-fnd7j" Apr 28 19:20:33.114043 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.113949 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-fnd7j" Apr 28 19:20:33.238728 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.238686 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-fnd7j"] Apr 28 19:20:33.241135 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:20:33.241108 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2198d378_f18c_4e35_a42e_3e910165822e.slice/crio-a4855d97934c12c5a2f46e45355704a1b54add31d4996c59539937f7356366f9 WatchSource:0}: Error finding container a4855d97934c12c5a2f46e45355704a1b54add31d4996c59539937f7356366f9: Status 404 returned error can't find the container with id a4855d97934c12c5a2f46e45355704a1b54add31d4996c59539937f7356366f9 Apr 28 19:20:33.509084 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.509045 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cbcf98b-b80d-43a0-b2da-ce216a2508d7-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-rkk8s\" (UID: \"8cbcf98b-b80d-43a0-b2da-ce216a2508d7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" Apr 28 19:20:33.511446 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.511426 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cbcf98b-b80d-43a0-b2da-ce216a2508d7-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-rkk8s\" (UID: \"8cbcf98b-b80d-43a0-b2da-ce216a2508d7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" Apr 28 19:20:33.684291 
ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.684246 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-fnd7j" event={"ID":"2198d378-f18c-4e35-a42e-3e910165822e","Type":"ContainerStarted","Data":"a4855d97934c12c5a2f46e45355704a1b54add31d4996c59539937f7356366f9"} Apr 28 19:20:33.687799 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.687771 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" Apr 28 19:20:33.862093 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:33.862043 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s"] Apr 28 19:20:33.883028 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:20:33.882971 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8cbcf98b_b80d_43a0_b2da_ce216a2508d7.slice/crio-05f1c1d1ab9b9e997251e9a8bb08e9a7680e30fd85641a56a801390fabafba8d WatchSource:0}: Error finding container 05f1c1d1ab9b9e997251e9a8bb08e9a7680e30fd85641a56a801390fabafba8d: Status 404 returned error can't find the container with id 05f1c1d1ab9b9e997251e9a8bb08e9a7680e30fd85641a56a801390fabafba8d Apr 28 19:20:34.688181 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:34.688145 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" event={"ID":"8cbcf98b-b80d-43a0-b2da-ce216a2508d7","Type":"ContainerStarted","Data":"05f1c1d1ab9b9e997251e9a8bb08e9a7680e30fd85641a56a801390fabafba8d"} Apr 28 19:20:37.701077 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:37.701038 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-fnd7j" event={"ID":"2198d378-f18c-4e35-a42e-3e910165822e","Type":"ContainerStarted","Data":"b75ef1cea338d89ef928dbc712d1e0f5bc33ceb959001858c7c3199017bc462b"} Apr 28 19:20:37.701532 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:37.701172 
2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-fnd7j" Apr 28 19:20:37.702466 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:37.702444 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" event={"ID":"8cbcf98b-b80d-43a0-b2da-ce216a2508d7","Type":"ContainerStarted","Data":"6c171087c35849dcc7c5afaf5273256d160cda9de588e84dc22bcf8a3df31ac1"} Apr 28 19:20:37.702553 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:37.702543 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" Apr 28 19:20:37.719799 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:37.719748 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-fnd7j" podStartSLOduration=2.018229601 podStartE2EDuration="5.719735592s" podCreationTimestamp="2026-04-28 19:20:32 +0000 UTC" firstStartedPulling="2026-04-28 19:20:33.242362003 +0000 UTC m=+290.176031776" lastFinishedPulling="2026-04-28 19:20:36.943867986 +0000 UTC m=+293.877537767" observedRunningTime="2026-04-28 19:20:37.718734511 +0000 UTC m=+294.652404304" watchObservedRunningTime="2026-04-28 19:20:37.719735592 +0000 UTC m=+294.653405388" Apr 28 19:20:37.742155 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:37.742105 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" podStartSLOduration=2.738989099 podStartE2EDuration="5.742090197s" podCreationTimestamp="2026-04-28 19:20:32 +0000 UTC" firstStartedPulling="2026-04-28 19:20:33.887458911 +0000 UTC m=+290.821128685" lastFinishedPulling="2026-04-28 19:20:36.890560007 +0000 UTC m=+293.824229783" observedRunningTime="2026-04-28 19:20:37.740286731 +0000 UTC m=+294.673956527" watchObservedRunningTime="2026-04-28 19:20:37.742090197 +0000 UTC m=+294.675760040" Apr 28 19:20:43.444963 
ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:43.444934 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log" Apr 28 19:20:43.445512 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:43.445342 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log" Apr 28 19:20:43.455389 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:43.455366 2565 kubelet.go:1628] "Image garbage collection succeeded" Apr 28 19:20:43.707954 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:20:43.707864 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-fnd7j" Apr 28 19:21:08.707201 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:08.707170 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rkk8s" Apr 28 19:21:43.166940 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.166864 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-b95ts"] Apr 28 19:21:43.169402 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.169381 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-b95ts" Apr 28 19:21:43.172874 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.172850 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 28 19:21:43.173000 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.172853 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-wb4gh\"" Apr 28 19:21:43.179728 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.179692 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-b95ts"] Apr 28 19:21:43.266055 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.266010 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99rkh\" (UniqueName: \"kubernetes.io/projected/8db2446d-89d4-4cef-8a23-804092ec958a-kube-api-access-99rkh\") pod \"model-serving-api-86f7b4b499-b95ts\" (UID: \"8db2446d-89d4-4cef-8a23-804092ec958a\") " pod="kserve/model-serving-api-86f7b4b499-b95ts" Apr 28 19:21:43.266234 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.266128 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db2446d-89d4-4cef-8a23-804092ec958a-tls-certs\") pod \"model-serving-api-86f7b4b499-b95ts\" (UID: \"8db2446d-89d4-4cef-8a23-804092ec958a\") " pod="kserve/model-serving-api-86f7b4b499-b95ts" Apr 28 19:21:43.366911 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.366877 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99rkh\" (UniqueName: \"kubernetes.io/projected/8db2446d-89d4-4cef-8a23-804092ec958a-kube-api-access-99rkh\") pod \"model-serving-api-86f7b4b499-b95ts\" (UID: \"8db2446d-89d4-4cef-8a23-804092ec958a\") " pod="kserve/model-serving-api-86f7b4b499-b95ts" Apr 28 19:21:43.367114 
ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.366939 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db2446d-89d4-4cef-8a23-804092ec958a-tls-certs\") pod \"model-serving-api-86f7b4b499-b95ts\" (UID: \"8db2446d-89d4-4cef-8a23-804092ec958a\") " pod="kserve/model-serving-api-86f7b4b499-b95ts" Apr 28 19:21:43.369387 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.369358 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db2446d-89d4-4cef-8a23-804092ec958a-tls-certs\") pod \"model-serving-api-86f7b4b499-b95ts\" (UID: \"8db2446d-89d4-4cef-8a23-804092ec958a\") " pod="kserve/model-serving-api-86f7b4b499-b95ts" Apr 28 19:21:43.405940 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.405905 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99rkh\" (UniqueName: \"kubernetes.io/projected/8db2446d-89d4-4cef-8a23-804092ec958a-kube-api-access-99rkh\") pod \"model-serving-api-86f7b4b499-b95ts\" (UID: \"8db2446d-89d4-4cef-8a23-804092ec958a\") " pod="kserve/model-serving-api-86f7b4b499-b95ts" Apr 28 19:21:43.483609 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.483525 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-wb4gh\"" Apr 28 19:21:43.491494 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.491457 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-b95ts" Apr 28 19:21:43.625673 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.625647 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-b95ts"] Apr 28 19:21:43.627767 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:21:43.627738 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db2446d_89d4_4cef_8a23_804092ec958a.slice/crio-8843c3512dade3f14d54f8e1ba20b5965443b0763ba20133ff8db511fa044e7a WatchSource:0}: Error finding container 8843c3512dade3f14d54f8e1ba20b5965443b0763ba20133ff8db511fa044e7a: Status 404 returned error can't find the container with id 8843c3512dade3f14d54f8e1ba20b5965443b0763ba20133ff8db511fa044e7a Apr 28 19:21:43.629604 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.629585 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:21:43.912737 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:43.912702 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-b95ts" event={"ID":"8db2446d-89d4-4cef-8a23-804092ec958a","Type":"ContainerStarted","Data":"8843c3512dade3f14d54f8e1ba20b5965443b0763ba20133ff8db511fa044e7a"} Apr 28 19:21:46.927121 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:46.927087 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-b95ts" event={"ID":"8db2446d-89d4-4cef-8a23-804092ec958a","Type":"ContainerStarted","Data":"8c96ad38bf8c98386c985d614036068acf0d8d28ceb28f86cdc0c33ba962303e"} Apr 28 19:21:46.927496 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:46.927204 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-b95ts" Apr 28 19:21:46.946786 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:46.946743 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-b95ts" podStartSLOduration=1.664275229 podStartE2EDuration="3.946731238s" podCreationTimestamp="2026-04-28 19:21:43 +0000 UTC" firstStartedPulling="2026-04-28 19:21:43.629768999 +0000 UTC m=+360.563438787" lastFinishedPulling="2026-04-28 19:21:45.912225022 +0000 UTC m=+362.845894796" observedRunningTime="2026-04-28 19:21:46.945136936 +0000 UTC m=+363.878806732" watchObservedRunningTime="2026-04-28 19:21:46.946731238 +0000 UTC m=+363.880401034" Apr 28 19:21:57.934469 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:21:57.934440 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-b95ts" Apr 28 19:22:22.453032 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.453001 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75bf858596-z4dln"] Apr 28 19:22:22.460972 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.460951 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.468192 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.468167 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 28 19:22:22.468648 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.468324 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ffxhh\"" Apr 28 19:22:22.468648 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.468379 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 28 19:22:22.468648 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.468450 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 28 19:22:22.469405 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.469375 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75bf858596-z4dln"] Apr 28 19:22:22.469650 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.469631 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 28 19:22:22.471561 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.471543 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 28 19:22:22.481668 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.481642 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 28 19:22:22.585687 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.585649 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/c759b144-e8b3-4336-9730-b9342c5d91d9-console-oauth-config\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.585687 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.585691 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-oauth-serving-cert\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.585898 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.585751 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rft4\" (UniqueName: \"kubernetes.io/projected/c759b144-e8b3-4336-9730-b9342c5d91d9-kube-api-access-9rft4\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.585898 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.585795 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-console-config\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.585898 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.585825 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c759b144-e8b3-4336-9730-b9342c5d91d9-console-serving-cert\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.585898 ip-10-0-133-121 
kubenswrapper[2565]: I0428 19:22:22.585840 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-trusted-ca-bundle\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.586101 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.585962 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-service-ca\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.686953 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.686906 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c759b144-e8b3-4336-9730-b9342c5d91d9-console-oauth-config\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.686953 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.686958 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-oauth-serving-cert\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.687203 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.686998 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rft4\" (UniqueName: \"kubernetes.io/projected/c759b144-e8b3-4336-9730-b9342c5d91d9-kube-api-access-9rft4\") pod \"console-75bf858596-z4dln\" (UID: 
\"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.687203 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.687021 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-console-config\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.687203 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.687047 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c759b144-e8b3-4336-9730-b9342c5d91d9-console-serving-cert\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.687203 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.687061 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-trusted-ca-bundle\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.687203 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.687098 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-service-ca\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.687746 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.687709 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-oauth-serving-cert\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.687876 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.687822 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-service-ca\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.687876 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.687822 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-console-config\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.688064 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.688039 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c759b144-e8b3-4336-9730-b9342c5d91d9-trusted-ca-bundle\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.689543 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.689520 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c759b144-e8b3-4336-9730-b9342c5d91d9-console-oauth-config\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.689790 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.689766 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c759b144-e8b3-4336-9730-b9342c5d91d9-console-serving-cert\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.696403 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.696380 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rft4\" (UniqueName: \"kubernetes.io/projected/c759b144-e8b3-4336-9730-b9342c5d91d9-kube-api-access-9rft4\") pod \"console-75bf858596-z4dln\" (UID: \"c759b144-e8b3-4336-9730-b9342c5d91d9\") " pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:22.771742 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:22.771708 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75bf858596-z4dln" Apr 28 19:22:23.133628 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:23.133394 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75bf858596-z4dln"] Apr 28 19:22:23.136511 ip-10-0-133-121 kubenswrapper[2565]: W0428 19:22:23.136482 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc759b144_e8b3_4336_9730_b9342c5d91d9.slice/crio-ed6cc988778e39dc8718fc78f984bc9ced2f9c5328dc0591d2d56a94066ec892 WatchSource:0}: Error finding container ed6cc988778e39dc8718fc78f984bc9ced2f9c5328dc0591d2d56a94066ec892: Status 404 returned error can't find the container with id ed6cc988778e39dc8718fc78f984bc9ced2f9c5328dc0591d2d56a94066ec892 Apr 28 19:22:24.050750 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:24.050715 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75bf858596-z4dln" event={"ID":"c759b144-e8b3-4336-9730-b9342c5d91d9","Type":"ContainerStarted","Data":"f98179744ce597109aa8c9f7650894d3b24721dd79125031f63c5517e6ed3934"} Apr 28 19:22:24.050750 ip-10-0-133-121 
kubenswrapper[2565]: I0428 19:22:24.050749 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75bf858596-z4dln" event={"ID":"c759b144-e8b3-4336-9730-b9342c5d91d9","Type":"ContainerStarted","Data":"ed6cc988778e39dc8718fc78f984bc9ced2f9c5328dc0591d2d56a94066ec892"}
Apr 28 19:22:24.076962 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:24.076914 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75bf858596-z4dln" podStartSLOduration=2.076900256 podStartE2EDuration="2.076900256s" podCreationTimestamp="2026-04-28 19:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:22:24.076213798 +0000 UTC m=+401.009883598" watchObservedRunningTime="2026-04-28 19:22:24.076900256 +0000 UTC m=+401.010570051"
Apr 28 19:22:32.772519 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:32.772477 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75bf858596-z4dln"
Apr 28 19:22:32.772519 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:32.772521 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75bf858596-z4dln"
Apr 28 19:22:32.777671 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:32.777637 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75bf858596-z4dln"
Apr 28 19:22:33.084063 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:22:33.083969 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75bf858596-z4dln"
Apr 28 19:25:43.469937 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:25:43.469908 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:25:43.470489 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:25:43.470223 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:30:43.500636 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:30:43.500536 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:30:43.503875 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:30:43.503852 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:35:43.529707 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:35:43.529671 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:35:43.533314 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:35:43.533290 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:40:43.551936 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:40:43.551909 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:40:43.555618 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:40:43.555593 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:45:43.574365 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:45:43.574336 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:45:43.578954 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:45:43.578930 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:50:43.596420 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:50:43.596392 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:50:43.603086 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:50:43.603062 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:55:43.617542 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:55:43.617507 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 19:55:43.623916 ip-10-0-133-121 kubenswrapper[2565]: I0428 19:55:43.623892 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 20:00:43.639035 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:00:43.638938 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 20:00:43.649455 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:00:43.649433 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 20:05:43.663709 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:05:43.663674 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 20:05:43.672222 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:05:43.672201 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 20:10:43.689550 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:10:43.689523 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 20:10:43.696225 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:10:43.696203 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 20:15:43.711239 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:15:43.711207 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 20:15:43.718790 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:15:43.718767 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 20:20:16.632771 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:16.632737 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pd44w_113829ad-8ae9-4887-9929-882aabb0a1cb/global-pull-secret-syncer/0.log"
Apr 28 20:20:16.694945 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:16.694917 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4tvmc_b1a47c9c-0b33-44f3-8e5a-5f69cab573b7/konnectivity-agent/0.log"
Apr 28 20:20:16.794353 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:16.794321 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-121.ec2.internal_57b3119f12c26029aa487a8b6a06e517/haproxy/0.log"
Apr 28 20:20:20.068194 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:20.068161 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-ml5vd_a4826c50-2384-48cf-853b-ab348926b6e5/cluster-monitoring-operator/0.log"
Apr 28 20:20:20.335344 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:20.335267 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jq5hc_93caaf59-62bc-4dc7-bb6d-81120b7144cb/node-exporter/0.log"
Apr 28 20:20:20.357330 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:20.357298 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jq5hc_93caaf59-62bc-4dc7-bb6d-81120b7144cb/kube-rbac-proxy/0.log"
Apr 28 20:20:20.378048 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:20.378024 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jq5hc_93caaf59-62bc-4dc7-bb6d-81120b7144cb/init-textfile/0.log"
Apr 28 20:20:20.728957 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:20.728874 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-97v2f_f26ab95f-87b7-46a9-a5db-7a5590ec350d/prometheus-operator/0.log"
Apr 28 20:20:20.746347 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:20.746319 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-97v2f_f26ab95f-87b7-46a9-a5db-7a5590ec350d/kube-rbac-proxy/0.log"
Apr 28 20:20:20.772882 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:20.772854 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-9b4wj_3f4c4045-dbe3-4ff4-866e-b265cc183438/prometheus-operator-admission-webhook/0.log"
Apr 28 20:20:22.176307 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:22.176264 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-l29s7_42b7e7a8-ff86-48ac-bd59-aab3db697272/networking-console-plugin/0.log"
Apr 28 20:20:22.618518 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:22.618486 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/1.log"
Apr 28 20:20:22.625837 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:22.625812 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p6wm2_f81f5464-aa16-489d-80bf-9e5bf953f7af/console-operator/2.log"
Apr 28 20:20:23.009486 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.009454 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75bf858596-z4dln_c759b144-e8b3-4336-9730-b9342c5d91d9/console/0.log"
Apr 28 20:20:23.051567 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.051540 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-ntdxj_828ffceb-04a7-4751-b3fd-abe2a2db01c5/download-server/0.log"
Apr 28 20:20:23.448930 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.448858 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-7r6rp_29ca57b8-ae4f-4261-a245-529c6cfa8449/volume-data-source-validator/0.log"
Apr 28 20:20:23.550784 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.550749 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"]
Apr 28 20:20:23.554144 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.554124 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.556528 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.556496 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-r9lxr\"/\"default-dockercfg-7h6qj\""
Apr 28 20:20:23.556528 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.556514 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r9lxr\"/\"openshift-service-ca.crt\""
Apr 28 20:20:23.557660 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.557639 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r9lxr\"/\"kube-root-ca.crt\""
Apr 28 20:20:23.565124 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.565094 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"]
Apr 28 20:20:23.686645 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.686608 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-podres\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.686645 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.686644 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-sys\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.686905 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.686667 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-lib-modules\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.686905 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.686728 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wzml\" (UniqueName: \"kubernetes.io/projected/440e164a-a25d-43c6-b6bc-e507579f5321-kube-api-access-4wzml\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.686905 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.686794 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-proc\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.787720 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.787681 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-lib-modules\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.787913 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.787730 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wzml\" (UniqueName: \"kubernetes.io/projected/440e164a-a25d-43c6-b6bc-e507579f5321-kube-api-access-4wzml\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.787913 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.787776 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-proc\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.787913 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.787881 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-lib-modules\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.787913 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.787899 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-proc\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.788114 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.788010 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-podres\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.788114 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.788040 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-sys\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.788114 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.788105 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-sys\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.788216 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.788140 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/440e164a-a25d-43c6-b6bc-e507579f5321-podres\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.799084 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.796035 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wzml\" (UniqueName: \"kubernetes.io/projected/440e164a-a25d-43c6-b6bc-e507579f5321-kube-api-access-4wzml\") pod \"perf-node-gather-daemonset-wrrg2\" (UID: \"440e164a-a25d-43c6-b6bc-e507579f5321\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:23.864034 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:23.864001 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:24.193942 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:24.193916 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"]
Apr 28 20:20:24.196474 ip-10-0-133-121 kubenswrapper[2565]: W0428 20:20:24.196444 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod440e164a_a25d_43c6_b6bc_e507579f5321.slice/crio-947f742f3f4a2010c1277485b5c1561a78b28e532b541098767e1bf1ae374a8a WatchSource:0}: Error finding container 947f742f3f4a2010c1277485b5c1561a78b28e532b541098767e1bf1ae374a8a: Status 404 returned error can't find the container with id 947f742f3f4a2010c1277485b5c1561a78b28e532b541098767e1bf1ae374a8a
Apr 28 20:20:24.198392 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:24.198374 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 20:20:24.261840 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:24.261818 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xzw6d_85a1e12e-6bf9-4548-83dd-a69765a8c24d/dns/0.log"
Apr 28 20:20:24.286260 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:24.286236 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xzw6d_85a1e12e-6bf9-4548-83dd-a69765a8c24d/kube-rbac-proxy/0.log"
Apr 28 20:20:24.337810 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:24.337784 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9vshb_a521c111-aa4b-4eda-b86b-7b9f76fcd75f/dns-node-resolver/0.log"
Apr 28 20:20:24.360593 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:24.360555 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2" event={"ID":"440e164a-a25d-43c6-b6bc-e507579f5321","Type":"ContainerStarted","Data":"9477322cc21888ff5ea77cb9f2dfc9ac40e3b58d066bdf55ecbd2c75f4316e7e"}
Apr 28 20:20:24.360593 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:24.360594 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2" event={"ID":"440e164a-a25d-43c6-b6bc-e507579f5321","Type":"ContainerStarted","Data":"947f742f3f4a2010c1277485b5c1561a78b28e532b541098767e1bf1ae374a8a"}
Apr 28 20:20:24.360785 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:24.360627 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:24.377472 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:24.377422 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2" podStartSLOduration=1.377409022 podStartE2EDuration="1.377409022s" podCreationTimestamp="2026-04-28 20:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:20:24.376144631 +0000 UTC m=+3881.309814428" watchObservedRunningTime="2026-04-28 20:20:24.377409022 +0000 UTC m=+3881.311078819"
Apr 28 20:20:24.822233 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:24.822186 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jn4w7_0b82110c-7495-4dba-b0fc-b29ca1b890f4/node-ca/0.log"
Apr 28 20:20:25.547709 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:25.547678 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d5c596596-vxxm2_ae143e08-b979-475a-abb9-654fdf653811/router/0.log"
Apr 28 20:20:25.904625 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:25.904537 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-t24nc_cd28326f-9576-4715-a0b0-f6812113130a/serve-healthcheck-canary/0.log"
Apr 28 20:20:26.243009 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:26.242910 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-fl75w_bd439354-dd75-46f4-9b42-13a91c27d851/insights-operator/0.log"
Apr 28 20:20:26.244636 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:26.244613 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-fl75w_bd439354-dd75-46f4-9b42-13a91c27d851/insights-operator/1.log"
Apr 28 20:20:26.334890 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:26.334857 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hvxg8_94af290b-e61d-41bf-b802-a973593865a7/kube-rbac-proxy/0.log"
Apr 28 20:20:26.355762 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:26.355739 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hvxg8_94af290b-e61d-41bf-b802-a973593865a7/exporter/0.log"
Apr 28 20:20:26.377248 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:26.377217 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hvxg8_94af290b-e61d-41bf-b802-a973593865a7/extractor/0.log"
Apr 28 20:20:29.674994 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:29.674946 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-rkk8s_8cbcf98b-b80d-43a0-b2da-ce216a2508d7/manager/0.log"
Apr 28 20:20:29.696891 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:29.696871 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-b95ts_8db2446d-89d4-4cef-8a23-804092ec958a/server/0.log"
Apr 28 20:20:30.373564 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:30.373540 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-wrrg2"
Apr 28 20:20:30.567233 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:30.567205 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-fnd7j_2198d378-f18c-4e35-a42e-3e910165822e/seaweedfs/0.log"
Apr 28 20:20:34.665303 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:34.665269 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mk44d_88aaf3ad-9181-4a09-9d18-1650fe56cac6/migrator/0.log"
Apr 28 20:20:34.685606 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:34.685582 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mk44d_88aaf3ad-9181-4a09-9d18-1650fe56cac6/graceful-termination/0.log"
Apr 28 20:20:34.993948 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:34.993859 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-r5k9t_d08c9e66-fe95-42f4-be54-32ce7e41a44e/kube-storage-version-migrator-operator/1.log"
Apr 28 20:20:34.995230 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:34.995207 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-r5k9t_d08c9e66-fe95-42f4-be54-32ce7e41a44e/kube-storage-version-migrator-operator/0.log"
Apr 28 20:20:36.237574 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:36.237546 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xt62h_0c174017-a3dc-4241-8008-c41fd1ae8cec/kube-multus-additional-cni-plugins/0.log"
Apr 28 20:20:36.258296 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:36.258263 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xt62h_0c174017-a3dc-4241-8008-c41fd1ae8cec/egress-router-binary-copy/0.log"
Apr 28 20:20:36.281723 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:36.281698 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xt62h_0c174017-a3dc-4241-8008-c41fd1ae8cec/cni-plugins/0.log"
Apr 28 20:20:36.304417 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:36.304389 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xt62h_0c174017-a3dc-4241-8008-c41fd1ae8cec/bond-cni-plugin/0.log"
Apr 28 20:20:36.327113 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:36.327086 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xt62h_0c174017-a3dc-4241-8008-c41fd1ae8cec/routeoverride-cni/0.log"
Apr 28 20:20:36.347691 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:36.347665 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xt62h_0c174017-a3dc-4241-8008-c41fd1ae8cec/whereabouts-cni-bincopy/0.log"
Apr 28 20:20:36.368413 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:36.368389 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xt62h_0c174017-a3dc-4241-8008-c41fd1ae8cec/whereabouts-cni/0.log"
Apr 28 20:20:36.457127 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:36.457086 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lqw5f_83a284ae-2839-4ef8-a791-9c32c55d6694/kube-multus/0.log"
Apr 28 20:20:36.534825 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:36.534796 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nfhpq_a559869f-dc8c-4397-aa54-b59c274faa74/network-metrics-daemon/0.log"
Apr 28 20:20:36.574609 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:36.574585 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nfhpq_a559869f-dc8c-4397-aa54-b59c274faa74/kube-rbac-proxy/0.log"
Apr 28 20:20:37.751057 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:37.751014 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hkn59_f0ca223d-22df-4d91-a877-7adbc2efde17/ovn-controller/0.log"
Apr 28 20:20:37.820696 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:37.820661 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hkn59_f0ca223d-22df-4d91-a877-7adbc2efde17/ovn-acl-logging/0.log"
Apr 28 20:20:37.845672 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:37.845637 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hkn59_f0ca223d-22df-4d91-a877-7adbc2efde17/kube-rbac-proxy-node/0.log"
Apr 28 20:20:37.868213 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:37.868183 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hkn59_f0ca223d-22df-4d91-a877-7adbc2efde17/kube-rbac-proxy-ovn-metrics/0.log"
Apr 28 20:20:37.885695 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:37.885670 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hkn59_f0ca223d-22df-4d91-a877-7adbc2efde17/northd/0.log"
Apr 28 20:20:37.912389 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:37.912358 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hkn59_f0ca223d-22df-4d91-a877-7adbc2efde17/nbdb/0.log"
Apr 28 20:20:37.937040 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:37.937011 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hkn59_f0ca223d-22df-4d91-a877-7adbc2efde17/sbdb/0.log"
Apr 28 20:20:38.102785 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:38.102740 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hkn59_f0ca223d-22df-4d91-a877-7adbc2efde17/ovnkube-controller/0.log"
Apr 28 20:20:39.301181 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:39.301147 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-9kcz9_97ec2c5a-89fa-4b8f-9919-ad126433ee06/check-endpoints/0.log"
Apr 28 20:20:39.381131 ip-10-0-133-121 kubenswrapper[2565]: I0428 20:20:39.381087 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zj9qs_d3913cb5-cdc7-4e4c-9f54-04992f3a0bcf/network-check-target-container/0.log"