Apr 21 15:32:40.760765 ip-10-0-136-162 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 15:32:40.760774 ip-10-0-136-162 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 15:32:40.760781 ip-10-0-136-162 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 15:32:40.761035 ip-10-0-136-162 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 15:32:50.874887 ip-10-0-136-162 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 15:32:50.874904 ip-10-0-136-162 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 34805d98f5f54af5a41c4745ee0731c3 --
Apr 21 15:35:11.167891 ip-10-0-136-162 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 15:35:11.583049 ip-10-0-136-162 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:11.583049 ip-10-0-136-162 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 15:35:11.583049 ip-10-0-136-162 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:11.583049 ip-10-0-136-162 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 15:35:11.583049 ip-10-0-136-162 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:11.585297 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.585197 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 15:35:11.590111 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590087 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:11.590111 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590109 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:11.590111 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590113 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590116 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590120 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590122 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590126 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590129 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590131 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590136 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590141 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590144 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590147 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590159 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590162 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590165 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590168 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590170 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590173 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590177 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590181 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:11.590221 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590184 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590187 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590190 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590192 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590195 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590198 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590200 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590204 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590206 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590209 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590211 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590214 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590217 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590219 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590222 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590226 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590229 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590232 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590235 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:11.590701 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590238 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590240 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590243 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590245 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590254 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590258 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590261 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590263 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590266 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590269 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590271 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590274 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590277 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590279 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590282 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590285 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590288 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590290 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590293 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590296 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:11.591172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590298 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590301 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590304 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590306 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590309 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590311 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590316 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590318 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590323 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590325 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590328 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590331 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590333 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590336 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590339 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590342 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590344 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590347 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590349 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590352 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:11.591688 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590354 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590357 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590359 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590362 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590365 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590367 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590810 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590816 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590819 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590821 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590824 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590827 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590830 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590834 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590837 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590840 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590843 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590845 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590849 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590851 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:11.592182 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590854 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590856 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590859 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590862 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590864 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590866 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590870 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590873 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590876 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590879 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590882 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590885 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590888 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590891 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590893 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590896 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590898 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590901 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590903 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590907 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:11.592697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590910 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590913 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590915 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590918 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590920 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590923 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590926 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590928 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590931 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590933 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590936 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590939 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590942 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590945 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590947 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590949 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590952 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590956 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590961 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:11.593219 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590963 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590967 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590969 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590972 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590975 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590978 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590981 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590983 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590986 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590989 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590992 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590994 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590996 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.590999 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591002 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591004 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591007 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591010 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591012 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591015 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:11.593716 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591017 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591020 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591023 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591025 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591028 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591031 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591033 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591036 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591038 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591040 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591043 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591045 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.591048 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591930 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591943 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591951 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591957 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591961 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591965 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591969 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591975 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 15:35:11.594244 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591978 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591981 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591985 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591988 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591991 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591995 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.591998 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592001 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592004 2576 flags.go:64] FLAG: --cloud-config=""
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592007 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592010 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592014 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592017 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592021 2576 flags.go:64] FLAG: --config-dir=""
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592024 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592027 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592032 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592036 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592039 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592043 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592046 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592049 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592052 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592055 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592058 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 15:35:11.594789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592063 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592068 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592071 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592074 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592077 2576 flags.go:64] FLAG: --enable-server="true"
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592080 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592086 2576 flags.go:64] FLAG: --event-burst="100"
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592090 2576 flags.go:64] FLAG: --event-qps="50"
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592093 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592096 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592100 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592104 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592107 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592111 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21
15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592114 2576 flags.go:64] FLAG: --eviction-soft="" Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592117 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592120 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592123 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592126 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592129 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592132 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592135 2576 flags.go:64] FLAG: --feature-gates="" Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592139 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592142 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592146 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 15:35:11.595405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592150 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592153 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592156 2576 flags.go:64] FLAG: --help="false" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592159 2576 flags.go:64] FLAG: 
--hostname-override="ip-10-0-136-162.ec2.internal" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592163 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592166 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592169 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592173 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592177 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592180 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592183 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592187 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592190 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592193 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592197 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592200 2576 flags.go:64] FLAG: --kube-reserved="" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592203 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 15:35:11.596066 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:35:11.592206 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592210 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592213 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592216 2576 flags.go:64] FLAG: --lock-file="" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592219 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592222 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592225 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 15:35:11.596066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592230 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592233 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592236 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592239 2576 flags.go:64] FLAG: --logging-format="text" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592242 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592246 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592249 2576 flags.go:64] FLAG: --manifest-url="" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592252 2576 flags.go:64] FLAG: 
--manifest-url-header="" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592257 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592260 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592265 2576 flags.go:64] FLAG: --max-pods="110" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592269 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592272 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592275 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592279 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592282 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592285 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592288 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592298 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592302 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592305 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592308 2576 flags.go:64] FLAG: --pod-cidr="" Apr 21 15:35:11.596673 
ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592311 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 15:35:11.596673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592318 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592321 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592325 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592328 2576 flags.go:64] FLAG: --port="10250" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592331 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592334 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0779c9b108bb58169" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592337 2576 flags.go:64] FLAG: --qos-reserved="" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592340 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592343 2576 flags.go:64] FLAG: --register-node="true" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592346 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592349 2576 flags.go:64] FLAG: --register-with-taints="" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592353 2576 flags.go:64] FLAG: --registry-burst="10" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592356 2576 flags.go:64] FLAG: --registry-qps="5" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: 
I0421 15:35:11.592359 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592362 2576 flags.go:64] FLAG: --reserved-memory="" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592366 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592369 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592372 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592375 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592382 2576 flags.go:64] FLAG: --runonce="false" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592389 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592393 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592396 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592399 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592405 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592408 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 15:35:11.597250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592411 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592414 2576 flags.go:64] FLAG: --storage-driver-password="root" 
Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592417 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592421 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592424 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592427 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592430 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592434 2576 flags.go:64] FLAG: --system-cgroups="" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592436 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592442 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592445 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592449 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592453 2576 flags.go:64] FLAG: --tls-min-version="" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592456 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592459 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592462 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592465 
2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592468 2576 flags.go:64] FLAG: --v="2" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592473 2576 flags.go:64] FLAG: --version="false" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592477 2576 flags.go:64] FLAG: --vmodule="" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592482 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.592485 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592611 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592615 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:11.597904 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592619 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592622 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592627 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592630 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592633 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592636 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 
21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592640 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592642 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592645 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592648 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592651 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592653 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592656 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592659 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592662 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592664 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592667 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592669 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592672 2576 
feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592675 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:11.598598 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592677 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592680 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592683 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592686 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592688 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592691 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592694 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592697 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592700 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592704 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592708 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592712 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592714 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592717 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592722 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592725 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592728 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592732 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592736 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:11.599096 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592738 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592741 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592744 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592746 2576 feature_gate.go:328] unrecognized feature 
gate: VSphereMultiNetworks Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592749 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592752 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592754 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592757 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592760 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592762 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592765 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592769 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592772 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592775 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592778 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592780 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592783 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592786 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592788 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592791 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:11.600290 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592793 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592796 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592798 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592801 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 
15:35:11.592803 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592806 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592809 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592812 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592815 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592818 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592821 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592825 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592827 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592830 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592833 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592835 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592838 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 
21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592840 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592843 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592845 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:11.600815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592848 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592850 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592853 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592856 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.592858 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.593504 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.600232 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 
15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.600250 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600300 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600305 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600309 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600312 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600315 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600318 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600321 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:11.601323 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600325 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600330 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600333 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600336 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600340 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600342 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600346 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600348 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600352 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600354 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600357 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600361 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600365 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600368 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600372 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600375 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600378 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600381 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600384 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600387 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:11.601730 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600390 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600393 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600396 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600399 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600402 
2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600405 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600408 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600411 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600414 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600417 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600419 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600422 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600424 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600428 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600431 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600434 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600436 2576 feature_gate.go:328] unrecognized feature gate: 
EtcdBackendQuota Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600439 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600442 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:35:11.602241 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600446 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600449 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600452 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600454 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600457 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600459 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600462 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600464 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600467 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600470 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600472 2576 
feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600475 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600477 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600481 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600483 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600501 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600506 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600509 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600512 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600514 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:11.602719 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600517 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600519 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600522 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: 
W0421 15:35:11.600525 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600527 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600530 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600533 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600535 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600538 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600540 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600543 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600545 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600550 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600562 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600565 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600568 2576 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerification Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600571 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600574 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600576 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:11.603209 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600579 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.600584 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600723 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600729 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600732 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600735 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600738 2576 feature_gate.go:328] unrecognized 
feature gate: AdminNetworkPolicy Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600740 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600743 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600746 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600748 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600751 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600754 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600756 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600759 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600761 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:35:11.603697 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600764 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600766 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600769 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600771 2576 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600774 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600776 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600778 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600781 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600784 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600788 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600791 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600800 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600803 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600806 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600809 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600812 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600815 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600817 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600820 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:11.604109 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600823 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600827 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600830 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600832 2576 
feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600835 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600838 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600841 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600844 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600847 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600850 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600852 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600855 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600857 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600860 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600862 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600865 2576 feature_gate.go:328] unrecognized 
feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600868 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600871 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600873 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:11.604588 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600876 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600878 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600881 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600884 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600886 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600889 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600897 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600900 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600902 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600905 2576 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600907 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600910 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600912 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600915 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600917 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600920 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600922 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600925 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600928 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600930 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600933 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:11.605060 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600936 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:11.605711 ip-10-0-136-162 
kubenswrapper[2576]: W0421 15:35:11.600938 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600941 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600944 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600946 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600949 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600951 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600954 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600957 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600959 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600962 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600964 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:11.600967 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.600972 2576 
feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.601160 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 15:35:11.605711 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.603468 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 15:35:11.606086 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.604292 2576 server.go:1019] "Starting client certificate rotation"
Apr 21 15:35:11.606086 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.604419 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:35:11.606086 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.604817 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:35:11.628140 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.628113 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:35:11.633818 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.633790 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:35:11.649426 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.649404 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 21 15:35:11.656504 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.656467 2576 log.go:25] "Validated CRI v1 image API"
Apr 21 15:35:11.657920 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.657904 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 15:35:11.661719 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.661694 2576 fs.go:135] Filesystem UUIDs: map[0dc62feb-7146-4464-971a-b157d8b63f6b:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 aaca8cdf-d537-4bf7-a741-d142e5b8c274:/dev/nvme0n1p3]
Apr 21 15:35:11.661804 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.661719 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 15:35:11.666013 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.665992 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 15:35:11.667962 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.667838 2576 manager.go:217] Machine: {Timestamp:2026-04-21 15:35:11.665723118 +0000 UTC m=+0.383800477 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101546 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2041107dbebf9f87af502113fc1493 SystemUUID:ec204110-7dbe-bf9f-87af-502113fc1493 BootID:34805d98-f5f5-4af5-a41c-4745ee0731c3 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ca:85:ad:a4:81 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ca:85:ad:a4:81 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c6:71:35:6b:54:1c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 15:35:11.667962 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.667957 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 15:35:11.668061 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.668050 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 15:35:11.669803 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.669773 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 15:35:11.669959 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.669805 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-162.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 15:35:11.670005 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.669970 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 15:35:11.670005 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.669979 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 15:35:11.670005 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.669992 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:35:11.670720 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.670709 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:35:11.672573 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.672561 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 15:35:11.672691 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.672674 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 15:35:11.674864 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.674854 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 15:35:11.674908 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.674873 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 15:35:11.674908 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.674886 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 15:35:11.674908 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.674896 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 21 15:35:11.674908 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.674906 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 15:35:11.676234 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.676221 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:35:11.676280 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.676241 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:35:11.679347 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.679327 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 15:35:11.680767 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.680753 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 15:35:11.681691 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681678 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 15:35:11.681733 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681696 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 15:35:11.681733 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681702 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 15:35:11.681733 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681707 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 15:35:11.681733 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681714 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 15:35:11.681733 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681720 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 15:35:11.681733 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681726 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 15:35:11.681733 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681732 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 15:35:11.681922 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681740 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 15:35:11.681922 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681747 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 15:35:11.681922 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681760 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 15:35:11.682005 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.681999 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 15:35:11.683989 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.683971 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 15:35:11.684079 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.683996 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 15:35:11.687067 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.687040 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-162.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 15:35:11.687144 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.687090 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-162.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 15:35:11.687144 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.687128 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 15:35:11.688160 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.688145 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 15:35:11.688199 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.688192 2576 server.go:1295] "Started kubelet"
Apr 21 15:35:11.688339 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.688287 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 15:35:11.688434 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.688317 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 15:35:11.688434 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.688391 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 15:35:11.689221 ip-10-0-136-162 systemd[1]: Started Kubernetes Kubelet.
Apr 21 15:35:11.689731 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.689576 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 15:35:11.690830 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.690811 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 15:35:11.695712 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.695692 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 15:35:11.696404 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.696386 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 15:35:11.696520 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.696456 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 15:35:11.697182 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697159 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 15:35:11.697182 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.697171 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-162.ec2.internal\" not found"
Apr 21 15:35:11.697331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697161 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 15:35:11.697331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697200 2576 factory.go:55] Registering systemd factory
Apr 21 15:35:11.697331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697215 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 21 15:35:11.697331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697202 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 15:35:11.697512 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697371 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 15:35:11.697512 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697380 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 15:35:11.697512 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697426 2576 factory.go:153] Registering CRI-O factory
Apr 21 15:35:11.697512 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697437 2576 factory.go:223] Registration of the crio container factory successfully
Apr 21 15:35:11.697512 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697480 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 15:35:11.697512 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697515 2576 factory.go:103] Registering Raw factory
Apr 21 15:35:11.697785 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697527 2576 manager.go:1196] Started watching for new ooms in manager
Apr 21 15:35:11.697899 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.697887 2576 manager.go:319] Starting recovery of all containers
Apr 21 15:35:11.704286 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.704056 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 21 15:35:11.704286 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.704072 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-162.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 21 15:35:11.705218 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.704110 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-162.ec2.internal.18a86930e3299774 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-162.ec2.internal,UID:ip-10-0-136-162.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-162.ec2.internal,},FirstTimestamp:2026-04-21 15:35:11.68816114 +0000 UTC m=+0.406238478,LastTimestamp:2026-04-21 15:35:11.68816114 +0000 UTC m=+0.406238478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-162.ec2.internal,}"
Apr 21 15:35:11.707703 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.707688 2576 manager.go:324] Recovery completed
Apr 21 15:35:11.709393 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.709317 2576 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 21 15:35:11.713196 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.713181 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:11.715699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.715665 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:11.715775 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.715714 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:11.715775 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.715725 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:11.716296 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.716281 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 15:35:11.716339 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.716296 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 15:35:11.716339 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.716312 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 15:35:11.717982 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.717911 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-162.ec2.internal.18a86930e4cdcc86 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-162.ec2.internal,UID:ip-10-0-136-162.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-162.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-162.ec2.internal,},FirstTimestamp:2026-04-21 15:35:11.715699846 +0000 UTC m=+0.433777191,LastTimestamp:2026-04-21 15:35:11.715699846 +0000 UTC m=+0.433777191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-162.ec2.internal,}"
Apr 21 15:35:11.718873 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.718861 2576 policy_none.go:49] "None policy: Start"
Apr 21 15:35:11.718913 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.718878 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 15:35:11.718913 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.718889 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 15:35:11.723963 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.723944 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bvdgx"
Apr 21 15:35:11.727397 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.727333 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-162.ec2.internal.18a86930e4ce1abd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-162.ec2.internal,UID:ip-10-0-136-162.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-136-162.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-136-162.ec2.internal,},FirstTimestamp:2026-04-21 15:35:11.715719869 +0000 UTC m=+0.433797207,LastTimestamp:2026-04-21 15:35:11.715719869 +0000 UTC m=+0.433797207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-162.ec2.internal,}"
Apr 21 15:35:11.733216 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.733197 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bvdgx"
Apr 21 15:35:11.769298 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.769274 2576 manager.go:341] "Starting Device Plugin manager"
Apr 21 15:35:11.770525 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.769310 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 15:35:11.770525 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.769322 2576 server.go:85] "Starting device plugin registration server"
Apr 21 15:35:11.770525 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.769609 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 15:35:11.770525 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.769620 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 15:35:11.770525 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.769738 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 15:35:11.770525 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.769813 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 15:35:11.770525 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.769821 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 15:35:11.770525 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.770269 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 15:35:11.770525 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.770307 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-162.ec2.internal\" not found"
Apr 21 15:35:11.795467 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.795433 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 15:35:11.796664 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.796640 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 15:35:11.796768 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.796669 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 15:35:11.796768 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.796689 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 15:35:11.796768 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.796695 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 15:35:11.796768 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.796729 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 15:35:11.800352 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.800327 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:11.870849 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.870770 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:11.872064 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.872047 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:11.872142 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.872081 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:11.872142 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.872094 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:11.872142 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.872119 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-162.ec2.internal"
Apr 21 15:35:11.882874 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.882851 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-162.ec2.internal"
Apr 21 15:35:11.882984 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.882877 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-162.ec2.internal\": node \"ip-10-0-136-162.ec2.internal\" not found"
Apr 21 15:35:11.897802 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.897774 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-162.ec2.internal"]
Apr 21 15:35:11.897907 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.897852 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:11.898863 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.898845 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:11.898946 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.898874 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:11.898946 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.898886 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:11.900255 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.900241 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:11.900417 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.900404 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:11.900483 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.900433 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:11.900734 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.900717 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-162.ec2.internal\" not found"
Apr 21 15:35:11.901015 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.901002 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:11.901084 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.901004 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:11.901084 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.901056 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:11.901084 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.901076 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:11.901084 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.901033 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:11.901211 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.901132 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:11.902248 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.902229 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:11.902348 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.902254 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:11.902888 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.902868 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:11.902964 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.902904 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:11.902964 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:11.902920 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:11.918101 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.918079 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-162.ec2.internal\" not found" node="ip-10-0-136-162.ec2.internal"
Apr 21 15:35:11.922087 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:11.922070 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-162.ec2.internal\" not found" node="ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.001407 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.001375 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-162.ec2.internal\" not found"
Apr 21 15:35:12.099181 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.099152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5305e4e7d26391430f02122ff2e4e388-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal\" (UID: \"5305e4e7d26391430f02122ff2e4e388\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.099181 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.099185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5305e4e7d26391430f02122ff2e4e388-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal\" (UID: \"5305e4e7d26391430f02122ff2e4e388\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.099380 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.099203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2ad89fa060fba0be0166e35605272b1a-config\") pod \"kube-apiserver-proxy-ip-10-0-136-162.ec2.internal\" (UID: \"2ad89fa060fba0be0166e35605272b1a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.102224 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.102194 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-162.ec2.internal\" not found"
Apr 21 15:35:12.200010 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.199910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2ad89fa060fba0be0166e35605272b1a-config\") pod \"kube-apiserver-proxy-ip-10-0-136-162.ec2.internal\" (UID: \"2ad89fa060fba0be0166e35605272b1a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.200010 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.199963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5305e4e7d26391430f02122ff2e4e388-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal\" (UID: \"5305e4e7d26391430f02122ff2e4e388\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.200010 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.199982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5305e4e7d26391430f02122ff2e4e388-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal\" (UID: \"5305e4e7d26391430f02122ff2e4e388\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.200210 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.200025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5305e4e7d26391430f02122ff2e4e388-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal\" (UID: \"5305e4e7d26391430f02122ff2e4e388\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.200210 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.200026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2ad89fa060fba0be0166e35605272b1a-config\") pod \"kube-apiserver-proxy-ip-10-0-136-162.ec2.internal\" (UID: \"2ad89fa060fba0be0166e35605272b1a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.200210 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.200043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5305e4e7d26391430f02122ff2e4e388-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal\" (UID: \"5305e4e7d26391430f02122ff2e4e388\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.203001 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.202983 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-162.ec2.internal\" not found"
Apr 21 15:35:12.220203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.220174 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.225143 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.225121 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-162.ec2.internal"
Apr 21 15:35:12.303191 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.303155 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-162.ec2.internal\" not found"
Apr 21 15:35:12.403688 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.403649 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-162.ec2.internal\" not found"
Apr 21 15:35:12.504312 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.504215 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-162.ec2.internal\" not found"
Apr 21 15:35:12.548150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.548122 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:12.603854 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.603817 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 15:35:12.604460 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.603972 2576 reflector.go:556] "Warning: watch ended with error"
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 15:35:12.604460 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.604000 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 15:35:12.604891 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.604873 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-162.ec2.internal\" not found" Apr 21 15:35:12.633366 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.633338 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:12.675331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.675284 2576 apiserver.go:52] "Watching apiserver" Apr 21 15:35:12.685847 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.685812 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 15:35:12.686959 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.686927 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-t694t","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5","openshift-network-operator/iptables-alerter-mcm77","openshift-ovn-kubernetes/ovnkube-node-f2fxv","kube-system/konnectivity-agent-7vpp5","openshift-cluster-node-tuning-operator/tuned-szbrh","openshift-image-registry/node-ca-b59t6","openshift-multus/multus-7d5rq","openshift-multus/multus-additional-cni-plugins-kmb6k","openshift-multus/network-metrics-daemon-sq5ln"] Apr 21 15:35:12.690237 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:35:12.690209 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:12.690339 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.690317 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9" Apr 21 15:35:12.691357 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.691332 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7vpp5" Apr 21 15:35:12.692584 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.692562 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.695047 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.695025 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jtnbc\"" Apr 21 15:35:12.695153 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.695104 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 15:35:12.695153 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.695127 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.695282 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.695249 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.695442 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.695355 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9brrf\"" Apr 21 15:35:12.695442 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.695433 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 15:35:12.695595 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.695507 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 15:35:12.695595 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.695524 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 15:35:12.695801 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.695767 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 15:35:12.695903 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.695863 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 15:35:12.695965 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.695914 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 15:35:12.696546 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.696474 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-mcm77" Apr 21 15:35:12.696670 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.696655 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal" Apr 21 15:35:12.697894 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.697878 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.699150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.699129 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.700473 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.700457 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b59t6" Apr 21 15:35:12.701688 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.701665 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:12.701786 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.701725 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d" Apr 21 15:35:12.701786 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.701747 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-62wlx\"" Apr 21 15:35:12.701879 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.701862 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 15:35:12.701924 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.701909 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 15:35:12.701980 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.701914 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 15:35:12.702030 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.702015 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 15:35:12.702065 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.702034 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 15:35:12.702263 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.702249 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:35:12.702440 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.702429 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 15:35:12.702579 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.702565 2576 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mlqqg\"" Apr 21 15:35:12.703499 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703429 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mdsh5\"" Apr 21 15:35:12.703627 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-run-openvswitch\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.703627 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703532 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 15:35:12.703627 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703554 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 15:35:12.703627 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/106889c2-d807-4d30-b424-f6c840d5717b-ovn-node-metrics-cert\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.703627 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703599 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dde473f9-f7fd-43ae-b044-2347e8649fee-agent-certs\") pod \"konnectivity-agent-7vpp5\" (UID: 
\"dde473f9-f7fd-43ae-b044-2347e8649fee\") " pod="kube-system/konnectivity-agent-7vpp5" Apr 21 15:35:12.703627 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703630 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 15:35:12.703896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703636 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 15:35:12.703896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703741 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:35:12.703896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703752 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 15:35:12.703896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703763 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 15:35:12.703896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703652 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 15:35:12.703896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97b39e40-65e5-492e-a61c-fbdc3987eeb5-cni-binary-copy\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.703896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703710 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-csxqg\"" Apr 21 15:35:12.703896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703710 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 15:35:12.703896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703655 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 15:35:12.703896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703854 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-socket-dir-parent\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.703896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwmfj\" (UniqueName: \"kubernetes.io/projected/97b39e40-65e5-492e-a61c-fbdc3987eeb5-kube-api-access-wwmfj\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703911 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb648a31-68d5-41f6-8194-806717864579-cni-binary-copy\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703927 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-run-ovn\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-hostroot\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.703988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm2vp\" (UniqueName: \"kubernetes.io/projected/59879dbb-8503-4959-ab94-2e0c86e91885-kube-api-access-rm2vp\") pod \"iptables-alerter-mcm77\" (UID: \"59879dbb-8503-4959-ab94-2e0c86e91885\") " pod="openshift-network-operator/iptables-alerter-mcm77" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-device-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.704377 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:35:12.704028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/59879dbb-8503-4959-ab94-2e0c86e91885-iptables-alerter-script\") pod \"iptables-alerter-mcm77\" (UID: \"59879dbb-8503-4959-ab94-2e0c86e91885\") " pod="openshift-network-operator/iptables-alerter-mcm77" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/106889c2-d807-4d30-b424-f6c840d5717b-env-overrides\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704044 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-sys-fs\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:35:12.704098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-cni-dir\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-conf-dir\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-node-log\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-run-netns\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb648a31-68d5-41f6-8194-806717864579-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " 
pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.704377 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704203 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pp4l5\"" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704208 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-var-lib-kubelet\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-socket-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-etc-openvswitch\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-log-socket\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.705150 
ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2726m\" (UniqueName: \"kubernetes.io/projected/106889c2-d807-4d30-b424-f6c840d5717b-kube-api-access-2726m\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gc47\" (UniqueName: \"kubernetes.io/projected/eb648a31-68d5-41f6-8194-806717864579-kube-api-access-4gc47\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-registration-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvlpn\" (UniqueName: \"kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn\") pod \"network-check-target-t694t\" (UID: \"04b20ea4-ea35-461d-8228-945315c5c4e9\") " pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-daemon-config\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-cnibin\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704475 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59879dbb-8503-4959-ab94-2e0c86e91885-host-slash\") pod \"iptables-alerter-mcm77\" (UID: \"59879dbb-8503-4959-ab94-2e0c86e91885\") " pod="openshift-network-operator/iptables-alerter-mcm77" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-var-lib-openvswitch\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-cni-netd\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/106889c2-d807-4d30-b424-f6c840d5717b-ovnkube-script-lib\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.705150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.704685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-cnibin\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-run-k8s-cni-cncf-io\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-run-multus-certs\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705212 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/106889c2-d807-4d30-b424-f6c840d5717b-ovnkube-config\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-kubelet\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-slash\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-os-release\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-var-lib-cni-multus\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.705773 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:35:12.705364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-system-cni-dir\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705409 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r52t6\" (UniqueName: \"kubernetes.io/projected/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-kube-api-access-r52t6\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-system-cni-dir\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-run-systemd\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-var-lib-cni-bin\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-os-release\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.705773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb648a31-68d5-41f6-8194-806717864579-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " 
pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.706362 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705647 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-systemd-units\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.706362 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dde473f9-f7fd-43ae-b044-2347e8649fee-konnectivity-ca\") pod \"konnectivity-agent-7vpp5\" (UID: \"dde473f9-f7fd-43ae-b044-2347e8649fee\") " pod="kube-system/konnectivity-agent-7vpp5" Apr 21 15:35:12.706362 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.706362 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705709 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-run-netns\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.706362 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-cni-bin\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.706362 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.705802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-etc-kubernetes\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.707898 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.707881 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-p5zgv\"" Apr 21 15:35:12.733273 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.733247 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 15:35:12.733771 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.733751 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:30:11 +0000 UTC" deadline="2028-01-09 09:50:28.5602103 +0000 UTC" Apr 21 15:35:12.733836 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.733772 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15066h15m15.826441798s" Apr 21 15:35:12.735127 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.735046 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:35:12.735347 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.735332 2576 
kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-162.ec2.internal" Apr 21 15:35:12.735980 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.735943 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal"] Apr 21 15:35:12.755311 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:12.755235 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ad89fa060fba0be0166e35605272b1a.slice/crio-203dec5bea2fecd8efc3629888965b3b09f48c530ccace7a09d5c34442b1ec13 WatchSource:0}: Error finding container 203dec5bea2fecd8efc3629888965b3b09f48c530ccace7a09d5c34442b1ec13: Status 404 returned error can't find the container with id 203dec5bea2fecd8efc3629888965b3b09f48c530ccace7a09d5c34442b1ec13 Apr 21 15:35:12.755571 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:12.755548 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5305e4e7d26391430f02122ff2e4e388.slice/crio-3dcfc38cb31d8500dc2e71a92f404fdedb9d83432f27a1469a5d49bad9625f50 WatchSource:0}: Error finding container 3dcfc38cb31d8500dc2e71a92f404fdedb9d83432f27a1469a5d49bad9625f50: Status 404 returned error can't find the container with id 3dcfc38cb31d8500dc2e71a92f404fdedb9d83432f27a1469a5d49bad9625f50 Apr 21 15:35:12.759405 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.759388 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:35:12.759787 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.759761 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-136-162.ec2.internal"] Apr 21 15:35:12.760580 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:35:12.760565 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:35:12.793942 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.793915 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xqngg" Apr 21 15:35:12.798773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.798748 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 15:35:12.799965 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.799928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-162.ec2.internal" event={"ID":"2ad89fa060fba0be0166e35605272b1a","Type":"ContainerStarted","Data":"203dec5bea2fecd8efc3629888965b3b09f48c530ccace7a09d5c34442b1ec13"} Apr 21 15:35:12.800842 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.800822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal" event={"ID":"5305e4e7d26391430f02122ff2e4e388","Type":"ContainerStarted","Data":"3dcfc38cb31d8500dc2e71a92f404fdedb9d83432f27a1469a5d49bad9625f50"} Apr 21 15:35:12.806083 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-cni-dir\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.806132 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-conf-dir\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " 
pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.806132 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-node-log\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.806132 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c946474c-a71a-4d12-a61d-2ca8c7e36fba-tmp\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.806266 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/785c95fa-5c55-4ea7-8b31-adcc8f22c2e2-host\") pod \"node-ca-b59t6\" (UID: \"785c95fa-5c55-4ea7-8b31-adcc8f22c2e2\") " pod="openshift-image-registry/node-ca-b59t6" Apr 21 15:35:12.806266 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnl7f\" (UniqueName: \"kubernetes.io/projected/785c95fa-5c55-4ea7-8b31-adcc8f22c2e2-kube-api-access-vnl7f\") pod \"node-ca-b59t6\" (UID: \"785c95fa-5c55-4ea7-8b31-adcc8f22c2e2\") " pod="openshift-image-registry/node-ca-b59t6" Apr 21 15:35:12.806266 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-conf-dir\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " 
pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.806266 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-run-netns\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.806266 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-node-log\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.806266 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb648a31-68d5-41f6-8194-806717864579-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.806266 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806236 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-run-netns\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.806266 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-var-lib-kubelet\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " 
pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-cni-dir\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-socket-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-var-lib-kubelet\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-etc-openvswitch\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-log-socket\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2726m\" (UniqueName: \"kubernetes.io/projected/106889c2-d807-4d30-b424-f6c840d5717b-kube-api-access-2726m\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-socket-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-etc-openvswitch\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-sys\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gc47\" (UniqueName: \"kubernetes.io/projected/eb648a31-68d5-41f6-8194-806717864579-kube-api-access-4gc47\") pod 
\"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-log-socket\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-registration-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlpn\" (UniqueName: \"kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn\") pod \"network-check-target-t694t\" (UID: \"04b20ea4-ea35-461d-8228-945315c5c4e9\") " pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-modprobe-d\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806561 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-sysctl-conf\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-registration-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.806644 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-var-lib-kubelet\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-daemon-config\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-cnibin\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:35:12.806689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59879dbb-8503-4959-ab94-2e0c86e91885-host-slash\") pod \"iptables-alerter-mcm77\" (UID: \"59879dbb-8503-4959-ab94-2e0c86e91885\") " pod="openshift-network-operator/iptables-alerter-mcm77" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-var-lib-openvswitch\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-cni-netd\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/106889c2-d807-4d30-b424-f6c840d5717b-ovnkube-script-lib\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-cnibin\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:35:12.806815 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-run-k8s-cni-cncf-io\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806816 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb648a31-68d5-41f6-8194-806717864579-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806838 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xqngg" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-cni-netd\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-run-multus-certs\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-run-multus-certs\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/106889c2-d807-4d30-b424-f6c840d5717b-ovnkube-config\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-cnibin\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-lib-modules\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806957 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-run-k8s-cni-cncf-io\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.807410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-j7knq\" (UniqueName: \"kubernetes.io/projected/c946474c-a71a-4d12-a61d-2ca8c7e36fba-kube-api-access-j7knq\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-kubelet\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.806952 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-cnibin\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-var-lib-openvswitch\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59879dbb-8503-4959-ab94-2e0c86e91885-host-slash\") pod \"iptables-alerter-mcm77\" (UID: \"59879dbb-8503-4959-ab94-2e0c86e91885\") " pod="openshift-network-operator/iptables-alerter-mcm77" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-slash\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-kubelet\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-slash\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-os-release\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-var-lib-cni-multus\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-os-release\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-var-lib-cni-multus\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-system-cni-dir\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-system-cni-dir\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807390 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r52t6\" (UniqueName: \"kubernetes.io/projected/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-kube-api-access-r52t6\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-kubernetes\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.808159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-system-cni-dir\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-run-systemd\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.808905 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:35:12.807619 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-run-systemd\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-system-cni-dir\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/106889c2-d807-4d30-b424-f6c840d5717b-ovnkube-script-lib\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/106889c2-d807-4d30-b424-f6c840d5717b-ovnkube-config\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 
15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-sysctl-d\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807712 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-var-lib-cni-bin\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-os-release\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-host-var-lib-cni-bin\") pod \"multus-7d5rq\" (UID: 
\"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb648a31-68d5-41f6-8194-806717864579-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-systemd-units\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb648a31-68d5-41f6-8194-806717864579-os-release\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dde473f9-f7fd-43ae-b044-2347e8649fee-konnectivity-ca\") pod \"konnectivity-agent-7vpp5\" (UID: \"dde473f9-f7fd-43ae-b044-2347e8649fee\") " pod="kube-system/konnectivity-agent-7vpp5" Apr 21 15:35:12.808905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-systemd-units\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-run-netns\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-cni-bin\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807974 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-tuned\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.807996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-run-netns\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-etc-kubernetes\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-run-openvswitch\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808082 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/106889c2-d807-4d30-b424-f6c840d5717b-ovn-node-metrics-cert\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-etc-kubernetes\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-run-openvswitch\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-cni-bin\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-daemon-config\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808106 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dde473f9-f7fd-43ae-b044-2347e8649fee-agent-certs\") pod \"konnectivity-agent-7vpp5\" (UID: \"dde473f9-f7fd-43ae-b044-2347e8649fee\") " pod="kube-system/konnectivity-agent-7vpp5" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-systemd\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.809421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-host\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97b39e40-65e5-492e-a61c-fbdc3987eeb5-cni-binary-copy\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-socket-dir-parent\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808303 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-multus-socket-dir-parent\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb648a31-68d5-41f6-8194-806717864579-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwmfj\" (UniqueName: \"kubernetes.io/projected/97b39e40-65e5-492e-a61c-fbdc3987eeb5-kube-api-access-wwmfj\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb648a31-68d5-41f6-8194-806717864579-cni-binary-copy\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dde473f9-f7fd-43ae-b044-2347e8649fee-konnectivity-ca\") pod \"konnectivity-agent-7vpp5\" (UID: \"dde473f9-f7fd-43ae-b044-2347e8649fee\") " pod="kube-system/konnectivity-agent-7vpp5" Apr 21 
15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-run-ovn\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808403 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-run-ovn\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm46l\" (UniqueName: \"kubernetes.io/projected/8b870a2e-b786-497a-8ee3-57668a43f22d-kube-api-access-vm46l\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:35:12.808439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/106889c2-d807-4d30-b424-f6c840d5717b-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-hostroot\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97b39e40-65e5-492e-a61c-fbdc3987eeb5-hostroot\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm2vp\" (UniqueName: \"kubernetes.io/projected/59879dbb-8503-4959-ab94-2e0c86e91885-kube-api-access-rm2vp\") pod \"iptables-alerter-mcm77\" (UID: \"59879dbb-8503-4959-ab94-2e0c86e91885\") " pod="openshift-network-operator/iptables-alerter-mcm77" Apr 21 15:35:12.809931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-device-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:35:12.808567 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/59879dbb-8503-4959-ab94-2e0c86e91885-iptables-alerter-script\") pod \"iptables-alerter-mcm77\" (UID: \"59879dbb-8503-4959-ab94-2e0c86e91885\") " pod="openshift-network-operator/iptables-alerter-mcm77" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/106889c2-d807-4d30-b424-f6c840d5717b-env-overrides\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-run\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-device-dir\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808647 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/785c95fa-5c55-4ea7-8b31-adcc8f22c2e2-serviceca\") pod \"node-ca-b59t6\" (UID: \"785c95fa-5c55-4ea7-8b31-adcc8f22c2e2\") " pod="openshift-image-registry/node-ca-b59t6" Apr 21 
15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-sys-fs\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-sysconfig\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97b39e40-65e5-492e-a61c-fbdc3987eeb5-cni-binary-copy\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-sys-fs\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb648a31-68d5-41f6-8194-806717864579-cni-binary-copy\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.808873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.809129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/59879dbb-8503-4959-ab94-2e0c86e91885-iptables-alerter-script\") pod \"iptables-alerter-mcm77\" (UID: \"59879dbb-8503-4959-ab94-2e0c86e91885\") " pod="openshift-network-operator/iptables-alerter-mcm77" Apr 21 15:35:12.810418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.809146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/106889c2-d807-4d30-b424-f6c840d5717b-env-overrides\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.811324 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.811306 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/106889c2-d807-4d30-b424-f6c840d5717b-ovn-node-metrics-cert\") 
pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.811461 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.811444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dde473f9-f7fd-43ae-b044-2347e8649fee-agent-certs\") pod \"konnectivity-agent-7vpp5\" (UID: \"dde473f9-f7fd-43ae-b044-2347e8649fee\") " pod="kube-system/konnectivity-agent-7vpp5" Apr 21 15:35:12.827952 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.827919 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwmfj\" (UniqueName: \"kubernetes.io/projected/97b39e40-65e5-492e-a61c-fbdc3987eeb5-kube-api-access-wwmfj\") pod \"multus-7d5rq\" (UID: \"97b39e40-65e5-492e-a61c-fbdc3987eeb5\") " pod="openshift-multus/multus-7d5rq" Apr 21 15:35:12.828523 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.828467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2726m\" (UniqueName: \"kubernetes.io/projected/106889c2-d807-4d30-b424-f6c840d5717b-kube-api-access-2726m\") pod \"ovnkube-node-f2fxv\" (UID: \"106889c2-d807-4d30-b424-f6c840d5717b\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:12.828761 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.828745 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:12.828814 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.828764 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:12.828814 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.828774 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cvlpn for 
pod openshift-network-diagnostics/network-check-target-t694t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:12.828878 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.828840 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn podName:04b20ea4-ea35-461d-8228-945315c5c4e9 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:13.328818855 +0000 UTC m=+2.046896201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cvlpn" (UniqueName: "kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn") pod "network-check-target-t694t" (UID: "04b20ea4-ea35-461d-8228-945315c5c4e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:12.830187 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.830169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r52t6\" (UniqueName: \"kubernetes.io/projected/6b3181d5-d024-4897-9f9e-8d78e9d97e4b-kube-api-access-r52t6\") pod \"aws-ebs-csi-driver-node-f5nj5\" (UID: \"6b3181d5-d024-4897-9f9e-8d78e9d97e4b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:12.830252 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.830240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm2vp\" (UniqueName: \"kubernetes.io/projected/59879dbb-8503-4959-ab94-2e0c86e91885-kube-api-access-rm2vp\") pod \"iptables-alerter-mcm77\" (UID: \"59879dbb-8503-4959-ab94-2e0c86e91885\") " pod="openshift-network-operator/iptables-alerter-mcm77" Apr 21 15:35:12.840371 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.840343 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4gc47\" (UniqueName: \"kubernetes.io/projected/eb648a31-68d5-41f6-8194-806717864579-kube-api-access-4gc47\") pod \"multus-additional-cni-plugins-kmb6k\" (UID: \"eb648a31-68d5-41f6-8194-806717864579\") " pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:12.909448 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c946474c-a71a-4d12-a61d-2ca8c7e36fba-tmp\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.909448 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/785c95fa-5c55-4ea7-8b31-adcc8f22c2e2-host\") pod \"node-ca-b59t6\" (UID: \"785c95fa-5c55-4ea7-8b31-adcc8f22c2e2\") " pod="openshift-image-registry/node-ca-b59t6" Apr 21 15:35:12.909693 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnl7f\" (UniqueName: \"kubernetes.io/projected/785c95fa-5c55-4ea7-8b31-adcc8f22c2e2-kube-api-access-vnl7f\") pod \"node-ca-b59t6\" (UID: \"785c95fa-5c55-4ea7-8b31-adcc8f22c2e2\") " pod="openshift-image-registry/node-ca-b59t6" Apr 21 15:35:12.909693 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909535 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/785c95fa-5c55-4ea7-8b31-adcc8f22c2e2-host\") pod \"node-ca-b59t6\" (UID: \"785c95fa-5c55-4ea7-8b31-adcc8f22c2e2\") " pod="openshift-image-registry/node-ca-b59t6" Apr 21 15:35:12.909693 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-sys\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.909693 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-modprobe-d\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.909693 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-sysctl-conf\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.909693 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-var-lib-kubelet\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.909693 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909672 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-sys\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.909693 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-lib-modules\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-var-lib-kubelet\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7knq\" (UniqueName: \"kubernetes.io/projected/c946474c-a71a-4d12-a61d-2ca8c7e36fba-kube-api-access-j7knq\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-modprobe-d\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-sysctl-conf\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-kubernetes\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909848 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-lib-modules\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-sysctl-d\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-kubernetes\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-tuned\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.909998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-systemd\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.910018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-host\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.910025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-sysctl-d\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910514 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.910044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vm46l\" (UniqueName: \"kubernetes.io/projected/8b870a2e-b786-497a-8ee3-57668a43f22d-kube-api-access-vm46l\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:12.910514 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.910071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-run\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910514 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.910076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-host\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910514 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.910088 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:12.910514 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.910112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-run\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910514 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.910137 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/785c95fa-5c55-4ea7-8b31-adcc8f22c2e2-serviceca\") pod \"node-ca-b59t6\" (UID: \"785c95fa-5c55-4ea7-8b31-adcc8f22c2e2\") " pod="openshift-image-registry/node-ca-b59t6" Apr 21 15:35:12.910514 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:12.910155 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs podName:8b870a2e-b786-497a-8ee3-57668a43f22d nodeName:}" failed. No retries permitted until 2026-04-21 15:35:13.410135992 +0000 UTC m=+2.128213337 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs") pod "network-metrics-daemon-sq5ln" (UID: "8b870a2e-b786-497a-8ee3-57668a43f22d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:12.910514 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.910071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-systemd\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910514 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.910198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-sysconfig\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910514 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.910266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-sysconfig\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.910514 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.910463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/785c95fa-5c55-4ea7-8b31-adcc8f22c2e2-serviceca\") pod \"node-ca-b59t6\" (UID: \"785c95fa-5c55-4ea7-8b31-adcc8f22c2e2\") " pod="openshift-image-registry/node-ca-b59t6" Apr 21 15:35:12.911811 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.911790 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c946474c-a71a-4d12-a61d-2ca8c7e36fba-tmp\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.911954 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.911939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c946474c-a71a-4d12-a61d-2ca8c7e36fba-etc-tuned\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.923361 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.923340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7knq\" (UniqueName: \"kubernetes.io/projected/c946474c-a71a-4d12-a61d-2ca8c7e36fba-kube-api-access-j7knq\") pod \"tuned-szbrh\" (UID: \"c946474c-a71a-4d12-a61d-2ca8c7e36fba\") " pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:12.926994 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.925183 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm46l\" (UniqueName: \"kubernetes.io/projected/8b870a2e-b786-497a-8ee3-57668a43f22d-kube-api-access-vm46l\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:12.931320 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.931300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnl7f\" (UniqueName: \"kubernetes.io/projected/785c95fa-5c55-4ea7-8b31-adcc8f22c2e2-kube-api-access-vnl7f\") pod \"node-ca-b59t6\" (UID: \"785c95fa-5c55-4ea7-8b31-adcc8f22c2e2\") " pod="openshift-image-registry/node-ca-b59t6" Apr 21 15:35:12.964012 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:12.963986 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:13.012804 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.012720 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7vpp5" Apr 21 15:35:13.019119 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:13.019098 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddde473f9_f7fd_43ae_b044_2347e8649fee.slice/crio-fe97114df80eaadce1b96b217d5241088f4ae092d90d0b6fcd40bec13232c4a3 WatchSource:0}: Error finding container fe97114df80eaadce1b96b217d5241088f4ae092d90d0b6fcd40bec13232c4a3: Status 404 returned error can't find the container with id fe97114df80eaadce1b96b217d5241088f4ae092d90d0b6fcd40bec13232c4a3 Apr 21 15:35:13.029990 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.029968 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7d5rq" Apr 21 15:35:13.036076 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:13.036051 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b39e40_65e5_492e_a61c_fbdc3987eeb5.slice/crio-52d164cd6eb91168d73ac98aeda4619c428e38f737c7af7292dc65d8f3ff0b36 WatchSource:0}: Error finding container 52d164cd6eb91168d73ac98aeda4619c428e38f737c7af7292dc65d8f3ff0b36: Status 404 returned error can't find the container with id 52d164cd6eb91168d73ac98aeda4619c428e38f737c7af7292dc65d8f3ff0b36 Apr 21 15:35:13.051503 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.051464 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kmb6k" Apr 21 15:35:13.057815 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:13.057785 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb648a31_68d5_41f6_8194_806717864579.slice/crio-cd38f45ca1d4b09369f11727d25ebce56f81d93101e11b01e5222f3bc3ede527 WatchSource:0}: Error finding container cd38f45ca1d4b09369f11727d25ebce56f81d93101e11b01e5222f3bc3ede527: Status 404 returned error can't find the container with id cd38f45ca1d4b09369f11727d25ebce56f81d93101e11b01e5222f3bc3ede527 Apr 21 15:35:13.058407 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.058389 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" Apr 21 15:35:13.064905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.064884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mcm77" Apr 21 15:35:13.065196 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:13.065160 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b3181d5_d024_4897_9f9e_8d78e9d97e4b.slice/crio-cbf0fce8e541dfe739b70a88bfc14220a1a54e41c4c8042c6f6da3768502d2e5 WatchSource:0}: Error finding container cbf0fce8e541dfe739b70a88bfc14220a1a54e41c4c8042c6f6da3768502d2e5: Status 404 returned error can't find the container with id cbf0fce8e541dfe739b70a88bfc14220a1a54e41c4c8042c6f6da3768502d2e5 Apr 21 15:35:13.070898 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.070876 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:35:13.071104 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:13.071077 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59879dbb_8503_4959_ab94_2e0c86e91885.slice/crio-3c245055954dd2acfaa12f2bc229e5bc8bb92d40d805fc68f5a32e5586555ba7 WatchSource:0}: Error finding container 3c245055954dd2acfaa12f2bc229e5bc8bb92d40d805fc68f5a32e5586555ba7: Status 404 returned error can't find the container with id 3c245055954dd2acfaa12f2bc229e5bc8bb92d40d805fc68f5a32e5586555ba7 Apr 21 15:35:13.076469 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.076449 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-szbrh" Apr 21 15:35:13.077079 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:13.077051 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod106889c2_d807_4d30_b424_f6c840d5717b.slice/crio-e52f7269e510e23abbce6730e9631703f57e11a33bc5eb2c2205e92503fff1e9 WatchSource:0}: Error finding container e52f7269e510e23abbce6730e9631703f57e11a33bc5eb2c2205e92503fff1e9: Status 404 returned error can't find the container with id e52f7269e510e23abbce6730e9631703f57e11a33bc5eb2c2205e92503fff1e9 Apr 21 15:35:13.080634 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.080617 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-b59t6" Apr 21 15:35:13.084531 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:13.084505 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc946474c_a71a_4d12_a61d_2ca8c7e36fba.slice/crio-f19f9bac6e3ce801ba5e6522737bb4aa92c66140068a7bf9deb7c38b30c1370b WatchSource:0}: Error finding container f19f9bac6e3ce801ba5e6522737bb4aa92c66140068a7bf9deb7c38b30c1370b: Status 404 returned error can't find the container with id f19f9bac6e3ce801ba5e6522737bb4aa92c66140068a7bf9deb7c38b30c1370b Apr 21 15:35:13.087534 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:13.087509 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod785c95fa_5c55_4ea7_8b31_adcc8f22c2e2.slice/crio-7a02ae727e1c9bece859dd4941231d6521aa0e781cdf489fb3f63af9de3bee3c WatchSource:0}: Error finding container 7a02ae727e1c9bece859dd4941231d6521aa0e781cdf489fb3f63af9de3bee3c: Status 404 returned error can't find the container with id 7a02ae727e1c9bece859dd4941231d6521aa0e781cdf489fb3f63af9de3bee3c Apr 21 15:35:13.413835 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.413746 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:13.414000 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.413835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlpn\" (UniqueName: \"kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn\") pod \"network-check-target-t694t\" (UID: \"04b20ea4-ea35-461d-8228-945315c5c4e9\") " 
pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:13.414000 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:13.413975 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:13.414000 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:13.413996 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:13.414171 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:13.414009 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cvlpn for pod openshift-network-diagnostics/network-check-target-t694t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:13.414171 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:13.414069 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn podName:04b20ea4-ea35-461d-8228-945315c5c4e9 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:14.414050064 +0000 UTC m=+3.132127397 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cvlpn" (UniqueName: "kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn") pod "network-check-target-t694t" (UID: "04b20ea4-ea35-461d-8228-945315c5c4e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:13.414502 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:13.414471 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:13.414631 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:13.414548 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs podName:8b870a2e-b786-497a-8ee3-57668a43f22d nodeName:}" failed. No retries permitted until 2026-04-21 15:35:14.414530626 +0000 UTC m=+3.132607955 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs") pod "network-metrics-daemon-sq5ln" (UID: "8b870a2e-b786-497a-8ee3-57668a43f22d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:13.798766 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.798685 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:13.799548 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:13.798832 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:13.807995 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.807900 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:12 +0000 UTC" deadline="2027-12-23 21:29:22.699390705 +0000 UTC"
Apr 21 15:35:13.807995 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.807943 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14669h54m8.891452307s"
Apr 21 15:35:13.819292 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.819202 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b59t6" event={"ID":"785c95fa-5c55-4ea7-8b31-adcc8f22c2e2","Type":"ContainerStarted","Data":"7a02ae727e1c9bece859dd4941231d6521aa0e781cdf489fb3f63af9de3bee3c"}
Apr 21 15:35:13.824855 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.824813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-szbrh" event={"ID":"c946474c-a71a-4d12-a61d-2ca8c7e36fba","Type":"ContainerStarted","Data":"f19f9bac6e3ce801ba5e6522737bb4aa92c66140068a7bf9deb7c38b30c1370b"}
Apr 21 15:35:13.833233 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.833196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" event={"ID":"106889c2-d807-4d30-b424-f6c840d5717b","Type":"ContainerStarted","Data":"e52f7269e510e23abbce6730e9631703f57e11a33bc5eb2c2205e92503fff1e9"}
Apr 21 15:35:13.836116 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.836086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mcm77" event={"ID":"59879dbb-8503-4959-ab94-2e0c86e91885","Type":"ContainerStarted","Data":"3c245055954dd2acfaa12f2bc229e5bc8bb92d40d805fc68f5a32e5586555ba7"}
Apr 21 15:35:13.844672 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.844640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" event={"ID":"6b3181d5-d024-4897-9f9e-8d78e9d97e4b","Type":"ContainerStarted","Data":"cbf0fce8e541dfe739b70a88bfc14220a1a54e41c4c8042c6f6da3768502d2e5"}
Apr 21 15:35:13.856374 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.856331 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7vpp5" event={"ID":"dde473f9-f7fd-43ae-b044-2347e8649fee","Type":"ContainerStarted","Data":"fe97114df80eaadce1b96b217d5241088f4ae092d90d0b6fcd40bec13232c4a3"}
Apr 21 15:35:13.865236 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.865207 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:13.867180 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.867146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmb6k" event={"ID":"eb648a31-68d5-41f6-8194-806717864579","Type":"ContainerStarted","Data":"cd38f45ca1d4b09369f11727d25ebce56f81d93101e11b01e5222f3bc3ede527"}
Apr 21 15:35:13.875958 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:13.875927 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7d5rq" event={"ID":"97b39e40-65e5-492e-a61c-fbdc3987eeb5","Type":"ContainerStarted","Data":"52d164cd6eb91168d73ac98aeda4619c428e38f737c7af7292dc65d8f3ff0b36"}
Apr 21 15:35:14.274670 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:14.274581 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:14.421620 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:14.421580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlpn\" (UniqueName: \"kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn\") pod \"network-check-target-t694t\" (UID: \"04b20ea4-ea35-461d-8228-945315c5c4e9\") " pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:14.421824 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:14.421654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:14.421824 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:14.421804 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:14.421951 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:14.421871 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs podName:8b870a2e-b786-497a-8ee3-57668a43f22d nodeName:}" failed. No retries permitted until 2026-04-21 15:35:16.421852094 +0000 UTC m=+5.139929423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs") pod "network-metrics-daemon-sq5ln" (UID: "8b870a2e-b786-497a-8ee3-57668a43f22d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:14.422346 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:14.422320 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:14.422346 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:14.422345 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:14.422535 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:14.422359 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cvlpn for pod openshift-network-diagnostics/network-check-target-t694t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:14.422535 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:14.422409 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn podName:04b20ea4-ea35-461d-8228-945315c5c4e9 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:16.422393084 +0000 UTC m=+5.140470411 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cvlpn" (UniqueName: "kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn") pod "network-check-target-t694t" (UID: "04b20ea4-ea35-461d-8228-945315c5c4e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:14.797531 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:14.797461 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:14.797735 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:14.797610 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9"
Apr 21 15:35:14.809009 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:14.808971 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:12 +0000 UTC" deadline="2027-10-04 17:57:44.931081892 +0000 UTC"
Apr 21 15:35:14.809009 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:14.809004 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12746h22m30.122081831s"
Apr 21 15:35:15.799740 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:15.799476 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:15.799740 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:15.799685 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:16.439125 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:16.438448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlpn\" (UniqueName: \"kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn\") pod \"network-check-target-t694t\" (UID: \"04b20ea4-ea35-461d-8228-945315c5c4e9\") " pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:16.439125 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:16.438524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:16.439125 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:16.438625 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:16.439125 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:16.438641 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:16.439125 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:16.438659 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:16.439125 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:16.438672 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cvlpn for pod openshift-network-diagnostics/network-check-target-t694t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:16.439125 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:16.438695 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs podName:8b870a2e-b786-497a-8ee3-57668a43f22d nodeName:}" failed. No retries permitted until 2026-04-21 15:35:20.438675808 +0000 UTC m=+9.156753137 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs") pod "network-metrics-daemon-sq5ln" (UID: "8b870a2e-b786-497a-8ee3-57668a43f22d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:16.439125 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:16.438712 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn podName:04b20ea4-ea35-461d-8228-945315c5c4e9 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:20.438703698 +0000 UTC m=+9.156781025 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cvlpn" (UniqueName: "kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn") pod "network-check-target-t694t" (UID: "04b20ea4-ea35-461d-8228-945315c5c4e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:16.797128 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:16.797037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:16.797303 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:16.797177 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9"
Apr 21 15:35:17.797036 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:17.797001 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:17.797586 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:17.797157 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:18.797809 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:18.797758 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:18.798332 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:18.797914 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9"
Apr 21 15:35:19.497163 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:19.496378 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6f2r4"]
Apr 21 15:35:19.500484 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:19.500406 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:19.500661 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:19.500510 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610"
Apr 21 15:35:19.566348 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:19.566300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:19.566534 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:19.566425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3a193bd2-d4b3-409b-a943-668e1838d610-kubelet-config\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:19.566534 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:19.566472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3a193bd2-d4b3-409b-a943-668e1838d610-dbus\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:19.667633 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:19.667595 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:19.667807 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:19.667705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3a193bd2-d4b3-409b-a943-668e1838d610-kubelet-config\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:19.667807 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:19.667743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3a193bd2-d4b3-409b-a943-668e1838d610-dbus\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:19.667932 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:19.667852 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:19.667932 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:19.667864 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3a193bd2-d4b3-409b-a943-668e1838d610-kubelet-config\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:19.668038 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:19.667930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3a193bd2-d4b3-409b-a943-668e1838d610-dbus\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:19.668038 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:19.667943 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret podName:3a193bd2-d4b3-409b-a943-668e1838d610 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:20.167923536 +0000 UTC m=+8.886000879 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret") pod "global-pull-secret-syncer-6f2r4" (UID: "3a193bd2-d4b3-409b-a943-668e1838d610") : object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:19.797438 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:19.797210 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:19.797438 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:19.797356 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:20.172358 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:20.172162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:20.172358 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:20.172279 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:20.172358 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:20.172351 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret podName:3a193bd2-d4b3-409b-a943-668e1838d610 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:21.172335435 +0000 UTC m=+9.890412761 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret") pod "global-pull-secret-syncer-6f2r4" (UID: "3a193bd2-d4b3-409b-a943-668e1838d610") : object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:20.476014 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:20.475256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlpn\" (UniqueName: \"kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn\") pod \"network-check-target-t694t\" (UID: \"04b20ea4-ea35-461d-8228-945315c5c4e9\") " pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:20.476014 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:20.475351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:20.476014 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:20.475533 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:20.476014 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:20.475612 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs podName:8b870a2e-b786-497a-8ee3-57668a43f22d nodeName:}" failed. No retries permitted until 2026-04-21 15:35:28.475590426 +0000 UTC m=+17.193667753 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs") pod "network-metrics-daemon-sq5ln" (UID: "8b870a2e-b786-497a-8ee3-57668a43f22d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:20.476014 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:20.475660 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:20.476014 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:20.475687 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:20.476014 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:20.475703 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cvlpn for pod openshift-network-diagnostics/network-check-target-t694t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:20.476014 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:20.475756 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn podName:04b20ea4-ea35-461d-8228-945315c5c4e9 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:28.475738801 +0000 UTC m=+17.193816132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cvlpn" (UniqueName: "kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn") pod "network-check-target-t694t" (UID: "04b20ea4-ea35-461d-8228-945315c5c4e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:20.799289 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:20.798933 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:20.799289 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:20.798990 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:20.800327 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:20.799577 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9"
Apr 21 15:35:20.800327 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:20.799630 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610"
Apr 21 15:35:21.182721 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:21.182677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:21.183157 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:21.182838 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:21.183157 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:21.182920 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret podName:3a193bd2-d4b3-409b-a943-668e1838d610 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:23.182901019 +0000 UTC m=+11.900978347 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret") pod "global-pull-secret-syncer-6f2r4" (UID: "3a193bd2-d4b3-409b-a943-668e1838d610") : object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:21.798449 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:21.798418 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:21.798637 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:21.798554 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:22.797270 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:22.797235 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:22.797735 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:22.797359 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9"
Apr 21 15:35:22.797735 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:22.797414 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:22.797735 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:22.797531 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610"
Apr 21 15:35:23.198098 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:23.198048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:23.198298 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:23.198223 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:23.198356 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:23.198298 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret podName:3a193bd2-d4b3-409b-a943-668e1838d610 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:27.198280081 +0000 UTC m=+15.916357409 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret") pod "global-pull-secret-syncer-6f2r4" (UID: "3a193bd2-d4b3-409b-a943-668e1838d610") : object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:23.797238 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:23.797196 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:23.797566 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:23.797348 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:24.797546 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:24.797510 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:24.797738 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:24.797510 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:24.797738 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:24.797630 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9"
Apr 21 15:35:24.798151 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:24.797752 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610"
Apr 21 15:35:25.797332 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:25.797287 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:25.797546 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:25.797426 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:26.797654 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:26.797620 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:26.798132 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:26.797620 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:26.798132 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:26.797748 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9"
Apr 21 15:35:26.798132 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:26.797815 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610"
Apr 21 15:35:27.223532 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:27.223412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:27.223686 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:27.223568 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:27.223686 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:27.223645 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret podName:3a193bd2-d4b3-409b-a943-668e1838d610 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:35.223628143 +0000 UTC m=+23.941705472 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret") pod "global-pull-secret-syncer-6f2r4" (UID: "3a193bd2-d4b3-409b-a943-668e1838d610") : object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:27.797941 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:27.797898 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:27.798355 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:27.798036 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:28.532041 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:28.532005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:28.532041 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:28.532058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlpn\" (UniqueName: \"kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn\") pod \"network-check-target-t694t\" (UID: \"04b20ea4-ea35-461d-8228-945315c5c4e9\") " pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:28.532332 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:28.532182 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:28.532332 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:28.532182 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:28.532332 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:28.532270 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs podName:8b870a2e-b786-497a-8ee3-57668a43f22d nodeName:}" failed. No retries permitted until 2026-04-21 15:35:44.532247552 +0000 UTC m=+33.250324889 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs") pod "network-metrics-daemon-sq5ln" (UID: "8b870a2e-b786-497a-8ee3-57668a43f22d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:28.532332 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:28.532196 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:28.532332 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:28.532313 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cvlpn for pod openshift-network-diagnostics/network-check-target-t694t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:28.532576 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:28.532367 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn podName:04b20ea4-ea35-461d-8228-945315c5c4e9 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:44.532351287 +0000 UTC m=+33.250428617 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cvlpn" (UniqueName: "kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn") pod "network-check-target-t694t" (UID: "04b20ea4-ea35-461d-8228-945315c5c4e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:28.797474 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:28.797392 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4" Apr 21 15:35:28.797640 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:28.797397 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:28.797640 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:28.797540 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610" Apr 21 15:35:28.797747 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:28.797631 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9" Apr 21 15:35:29.797307 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:29.797276 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:29.797699 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:29.797428 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d" Apr 21 15:35:30.797044 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:30.797018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4" Apr 21 15:35:30.797156 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:30.797018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:30.797156 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:30.797122 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610" Apr 21 15:35:30.797232 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:30.797204 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9" Apr 21 15:35:31.797903 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:31.797805 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:31.798704 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:31.797936 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d" Apr 21 15:35:31.915406 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:31.915364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7d5rq" event={"ID":"97b39e40-65e5-492e-a61c-fbdc3987eeb5","Type":"ContainerStarted","Data":"36ee82eba356d3e30ffc4c753f838ed03ebe56e13f7ec0d7c4750c2904d35671"} Apr 21 15:35:31.920925 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:31.920860 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-162.ec2.internal" event={"ID":"2ad89fa060fba0be0166e35605272b1a","Type":"ContainerStarted","Data":"6029270cc45f7fbdb39305b130272ca8eabacb8f9df29a95a5dc4fe6a46fc53f"} Apr 21 15:35:31.927068 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:31.927007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-szbrh" event={"ID":"c946474c-a71a-4d12-a61d-2ca8c7e36fba","Type":"ContainerStarted","Data":"b603ba3a9eb7505792d96d69945028c10daf2318534041cf33cadbe3cf570c75"} Apr 21 15:35:31.931756 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:31.931733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" event={"ID":"106889c2-d807-4d30-b424-f6c840d5717b","Type":"ContainerStarted","Data":"7f43eb57d455363e41d4db8f8617d197b5d6f1f635080a0e2b762973a15a217c"} Apr 21 15:35:31.931855 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:35:31.931767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" event={"ID":"106889c2-d807-4d30-b424-f6c840d5717b","Type":"ContainerStarted","Data":"84bf908957a786818be043e9b7984170a2a6adf4ee759062c0336b80a4e3a572"} Apr 21 15:35:31.931855 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:31.931780 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" event={"ID":"106889c2-d807-4d30-b424-f6c840d5717b","Type":"ContainerStarted","Data":"d1d60e9d0aa42077eb3007a4280ae93b708f47cd9ada696d204c813f158b5f8a"} Apr 21 15:35:31.931855 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:31.931793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" event={"ID":"106889c2-d807-4d30-b424-f6c840d5717b","Type":"ContainerStarted","Data":"ac65ba65ef2c4b14ee0d5944c85b97c5e9e74a6bb99108921e53805e80ab6669"} Apr 21 15:35:31.931855 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:31.931805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" event={"ID":"106889c2-d807-4d30-b424-f6c840d5717b","Type":"ContainerStarted","Data":"e24c75959e9a494d5597b80ce0142699cda37ae26df91b0324058f0367be22de"} Apr 21 15:35:31.933662 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:31.933032 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7d5rq" podStartSLOduration=2.908327699 podStartE2EDuration="20.933016203s" podCreationTimestamp="2026-04-21 15:35:11 +0000 UTC" firstStartedPulling="2026-04-21 15:35:13.037475636 +0000 UTC m=+1.755552965" lastFinishedPulling="2026-04-21 15:35:31.06216413 +0000 UTC m=+19.780241469" observedRunningTime="2026-04-21 15:35:31.932807215 +0000 UTC m=+20.650884564" watchObservedRunningTime="2026-04-21 15:35:31.933016203 +0000 UTC m=+20.651093550" Apr 21 15:35:31.949845 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:31.949133 
2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-162.ec2.internal" podStartSLOduration=19.948968458 podStartE2EDuration="19.948968458s" podCreationTimestamp="2026-04-21 15:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:31.948019108 +0000 UTC m=+20.666096456" watchObservedRunningTime="2026-04-21 15:35:31.948968458 +0000 UTC m=+20.667045832" Apr 21 15:35:31.967188 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:31.967127 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-szbrh" podStartSLOduration=2.320977235 podStartE2EDuration="19.967105959s" podCreationTimestamp="2026-04-21 15:35:12 +0000 UTC" firstStartedPulling="2026-04-21 15:35:13.08610199 +0000 UTC m=+1.804179336" lastFinishedPulling="2026-04-21 15:35:30.732230719 +0000 UTC m=+19.450308060" observedRunningTime="2026-04-21 15:35:31.967028565 +0000 UTC m=+20.685105951" watchObservedRunningTime="2026-04-21 15:35:31.967105959 +0000 UTC m=+20.685183308" Apr 21 15:35:32.798129 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.797929 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4" Apr 21 15:35:32.798550 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.797984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:32.798550 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:32.798214 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610" Apr 21 15:35:32.798550 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:32.798278 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9" Apr 21 15:35:32.936743 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.936705 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" event={"ID":"106889c2-d807-4d30-b424-f6c840d5717b","Type":"ContainerStarted","Data":"555319e702f2313797bafa149e3c54ac0cb46fea6d29d990e7ecc5d434753c48"} Apr 21 15:35:32.937937 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.937909 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mcm77" event={"ID":"59879dbb-8503-4959-ab94-2e0c86e91885","Type":"ContainerStarted","Data":"817ddb4986c53ce89e928de4d03d9110dbfd63caaec8bd849255b90355947b42"} Apr 21 15:35:32.939058 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.939037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" event={"ID":"6b3181d5-d024-4897-9f9e-8d78e9d97e4b","Type":"ContainerStarted","Data":"7ad1b129f5578803dbc4a15ecccc627a151b60989e3ca2e647dd6c7ef5f31868"} Apr 21 15:35:32.940102 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.940081 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7vpp5" event={"ID":"dde473f9-f7fd-43ae-b044-2347e8649fee","Type":"ContainerStarted","Data":"1d99ee313321380a5c051410fef5ce5c7e7250aba0fbcb2f2f11e7adae9e5deb"} Apr 21 15:35:32.941239 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:35:32.941213 2576 generic.go:358] "Generic (PLEG): container finished" podID="5305e4e7d26391430f02122ff2e4e388" containerID="987704528b4e7c93b81094fb0d35a9eae477d725ebfc99ae8badcc17c5c7fbb3" exitCode=0 Apr 21 15:35:32.941317 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.941286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal" event={"ID":"5305e4e7d26391430f02122ff2e4e388","Type":"ContainerDied","Data":"987704528b4e7c93b81094fb0d35a9eae477d725ebfc99ae8badcc17c5c7fbb3"} Apr 21 15:35:32.942608 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.942586 2576 generic.go:358] "Generic (PLEG): container finished" podID="eb648a31-68d5-41f6-8194-806717864579" containerID="3669f02e5cebd941786f5deb63d7e4eb7fa8883906a40a84ba01e51d3be194e1" exitCode=0 Apr 21 15:35:32.942685 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.942668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmb6k" event={"ID":"eb648a31-68d5-41f6-8194-806717864579","Type":"ContainerDied","Data":"3669f02e5cebd941786f5deb63d7e4eb7fa8883906a40a84ba01e51d3be194e1"} Apr 21 15:35:32.944031 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.943990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b59t6" event={"ID":"785c95fa-5c55-4ea7-8b31-adcc8f22c2e2","Type":"ContainerStarted","Data":"09173fa2b81964ff78bbc4ca83acf50ee4e484a4dd84e388181dd79b86ee5037"} Apr 21 15:35:32.954167 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.954116 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mcm77" podStartSLOduration=3.296406587 podStartE2EDuration="20.954102405s" podCreationTimestamp="2026-04-21 15:35:12 +0000 UTC" firstStartedPulling="2026-04-21 15:35:13.07247157 +0000 UTC m=+1.790548896" lastFinishedPulling="2026-04-21 
15:35:30.730167378 +0000 UTC m=+19.448244714" observedRunningTime="2026-04-21 15:35:32.953401522 +0000 UTC m=+21.671478870" watchObservedRunningTime="2026-04-21 15:35:32.954102405 +0000 UTC m=+21.672179753" Apr 21 15:35:32.985962 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.985914 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 15:35:32.989851 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:32.989804 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7vpp5" podStartSLOduration=4.280343335 podStartE2EDuration="21.989784629s" podCreationTimestamp="2026-04-21 15:35:11 +0000 UTC" firstStartedPulling="2026-04-21 15:35:13.020648784 +0000 UTC m=+1.738726110" lastFinishedPulling="2026-04-21 15:35:30.730090062 +0000 UTC m=+19.448167404" observedRunningTime="2026-04-21 15:35:32.969235769 +0000 UTC m=+21.687313130" watchObservedRunningTime="2026-04-21 15:35:32.989784629 +0000 UTC m=+21.707861978" Apr 21 15:35:33.009074 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:33.009009 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-b59t6" podStartSLOduration=3.367901192 podStartE2EDuration="21.00898934s" podCreationTimestamp="2026-04-21 15:35:12 +0000 UTC" firstStartedPulling="2026-04-21 15:35:13.088903909 +0000 UTC m=+1.806981236" lastFinishedPulling="2026-04-21 15:35:30.729992043 +0000 UTC m=+19.448069384" observedRunningTime="2026-04-21 15:35:33.008279175 +0000 UTC m=+21.726356523" watchObservedRunningTime="2026-04-21 15:35:33.00898934 +0000 UTC m=+21.727066689" Apr 21 15:35:33.781707 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:33.781595 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T15:35:32.985938678Z","UUID":"eaaf44d4-dba8-4d17-8fed-eaa9381ed9ab","Handler":null,"Name":"","Endpoint":""} Apr 21 15:35:33.785008 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:33.784573 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 15:35:33.785008 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:33.784605 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 15:35:33.797435 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:33.797002 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:33.797435 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:33.797138 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d" Apr 21 15:35:33.948006 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:33.947953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" event={"ID":"6b3181d5-d024-4897-9f9e-8d78e9d97e4b","Type":"ContainerStarted","Data":"26a5a9c7b79a85579cade6a9e7c2fd69fab85d5a7f9031f7dd921a059ded40e4"} Apr 21 15:35:33.949939 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:33.949905 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal" event={"ID":"5305e4e7d26391430f02122ff2e4e388","Type":"ContainerStarted","Data":"f0d4836519da6907f24fa3d14cc6bf93c5b676e350547d0179d9e2f61b3c1721"} Apr 21 15:35:33.965200 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:33.965142 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-162.ec2.internal" podStartSLOduration=21.965120735 podStartE2EDuration="21.965120735s" podCreationTimestamp="2026-04-21 15:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:33.964955324 +0000 UTC m=+22.683032673" watchObservedRunningTime="2026-04-21 15:35:33.965120735 +0000 UTC m=+22.683198085" Apr 21 15:35:34.528519 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:34.528410 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7vpp5" Apr 21 15:35:34.529068 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:34.529050 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7vpp5" Apr 21 15:35:34.797034 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:34.796944 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:34.797206 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:34.796945 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4" Apr 21 15:35:34.797206 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:34.797076 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9" Apr 21 15:35:34.797206 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:34.797171 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610" Apr 21 15:35:34.955035 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:34.954991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" event={"ID":"106889c2-d807-4d30-b424-f6c840d5717b","Type":"ContainerStarted","Data":"4245e9270a0e5fd21fbbb92094efaad14f0993eb0794222f137262b1f56e8d86"} Apr 21 15:35:34.956829 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:34.956803 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" event={"ID":"6b3181d5-d024-4897-9f9e-8d78e9d97e4b","Type":"ContainerStarted","Data":"f302b735b76e64d9057cf815a5b425014f23d6b84c628196dc5c956c3580ac18"} Apr 21 15:35:34.957284 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:34.957258 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7vpp5" Apr 21 15:35:34.957760 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:34.957741 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7vpp5" Apr 21 15:35:34.976092 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:34.976040 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5nj5" podStartSLOduration=2.134155824 podStartE2EDuration="22.976025882s" podCreationTimestamp="2026-04-21 15:35:12 +0000 UTC" firstStartedPulling="2026-04-21 15:35:13.067582163 +0000 UTC m=+1.785659489" lastFinishedPulling="2026-04-21 15:35:33.909452221 +0000 UTC m=+22.627529547" observedRunningTime="2026-04-21 15:35:34.975740301 +0000 UTC m=+23.693817649" watchObservedRunningTime="2026-04-21 15:35:34.976025882 +0000 UTC m=+23.694103227" Apr 21 15:35:35.280681 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:35.280595 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4" Apr 21 15:35:35.280831 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:35.280774 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:35.280881 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:35.280867 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret podName:3a193bd2-d4b3-409b-a943-668e1838d610 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:51.28084574 +0000 UTC m=+39.998923069 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret") pod "global-pull-secret-syncer-6f2r4" (UID: "3a193bd2-d4b3-409b-a943-668e1838d610") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:35.797934 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:35.797902 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:35.798128 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:35.798026 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d" Apr 21 15:35:36.797100 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:36.797066 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4" Apr 21 15:35:36.797587 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:36.797071 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:36.797587 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:36.797208 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610" Apr 21 15:35:36.797587 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:36.797270 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9" Apr 21 15:35:37.797796 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:37.797539 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:37.798508 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:37.797850 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:37.965430 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:37.965394 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" event={"ID":"106889c2-d807-4d30-b424-f6c840d5717b","Type":"ContainerStarted","Data":"a6176cd24e2dcf12b5228209f4672ae99cbfd72eb4d8da74ec406638cd2458a0"}
Apr 21 15:35:37.965797 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:37.965775 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv"
Apr 21 15:35:37.965797 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:37.965806 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv"
Apr 21 15:35:37.967217 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:37.967196 2576 generic.go:358] "Generic (PLEG): container finished" podID="eb648a31-68d5-41f6-8194-806717864579" containerID="f08799038a6d9b266456138d7ab85f8d1076910cd5a5dd095b0b63491076a4e1" exitCode=0
Apr 21 15:35:37.967314 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:37.967236 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmb6k" event={"ID":"eb648a31-68d5-41f6-8194-806717864579","Type":"ContainerDied","Data":"f08799038a6d9b266456138d7ab85f8d1076910cd5a5dd095b0b63491076a4e1"}
Apr 21 15:35:37.985286 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:37.985252 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv"
Apr 21 15:35:37.997231 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:37.997186 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" podStartSLOduration=7.640292255 podStartE2EDuration="25.997174478s" podCreationTimestamp="2026-04-21 15:35:12 +0000 UTC" firstStartedPulling="2026-04-21 15:35:13.080723145 +0000 UTC m=+1.798800474" lastFinishedPulling="2026-04-21 15:35:31.437605357 +0000 UTC m=+20.155682697" observedRunningTime="2026-04-21 15:35:37.997084721 +0000 UTC m=+26.715162068" watchObservedRunningTime="2026-04-21 15:35:37.997174478 +0000 UTC m=+26.715251826"
Apr 21 15:35:38.797467 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:38.797291 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:38.797666 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:38.797319 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:38.797666 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:38.797565 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610"
Apr 21 15:35:38.797666 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:38.797629 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9"
Apr 21 15:35:38.970699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:38.970624 2576 generic.go:358] "Generic (PLEG): container finished" podID="eb648a31-68d5-41f6-8194-806717864579" containerID="31f744100fb53ce9a10865558396ccc1cfad8717cfa29355679edd71f342b395" exitCode=0
Apr 21 15:35:38.971045 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:38.970704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmb6k" event={"ID":"eb648a31-68d5-41f6-8194-806717864579","Type":"ContainerDied","Data":"31f744100fb53ce9a10865558396ccc1cfad8717cfa29355679edd71f342b395"}
Apr 21 15:35:38.971402 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:38.971333 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv"
Apr 21 15:35:38.985710 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:38.985681 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv"
Apr 21 15:35:39.160172 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:39.160140 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6f2r4"]
Apr 21 15:35:39.160345 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:39.160228 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:39.160345 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:39.160308 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610"
Apr 21 15:35:39.168851 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:39.168819 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sq5ln"]
Apr 21 15:35:39.168981 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:39.168945 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:39.169063 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:39.169044 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:39.169622 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:39.169590 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t694t"]
Apr 21 15:35:39.169714 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:39.169672 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:39.169793 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:39.169764 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9"
Apr 21 15:35:39.974938 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:39.974807 2576 generic.go:358] "Generic (PLEG): container finished" podID="eb648a31-68d5-41f6-8194-806717864579" containerID="2fd7daff0627da25660998cba056dd47f92f0fc499c9fa08882448c838acac9c" exitCode=0
Apr 21 15:35:39.974938 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:39.974877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmb6k" event={"ID":"eb648a31-68d5-41f6-8194-806717864579","Type":"ContainerDied","Data":"2fd7daff0627da25660998cba056dd47f92f0fc499c9fa08882448c838acac9c"}
Apr 21 15:35:40.272181 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.272152 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-wl2ck"]
Apr 21 15:35:40.274943 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.274927 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wl2ck"
Apr 21 15:35:40.277751 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.277731 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8dt82\""
Apr 21 15:35:40.278003 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.277986 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 15:35:40.278121 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.278034 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 15:35:40.431343 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.431308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7197e20-bb57-4167-b435-1446351d6727-hosts-file\") pod \"node-resolver-wl2ck\" (UID: \"e7197e20-bb57-4167-b435-1446351d6727\") " pod="openshift-dns/node-resolver-wl2ck"
Apr 21 15:35:40.431546 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.431348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7197e20-bb57-4167-b435-1446351d6727-tmp-dir\") pod \"node-resolver-wl2ck\" (UID: \"e7197e20-bb57-4167-b435-1446351d6727\") " pod="openshift-dns/node-resolver-wl2ck"
Apr 21 15:35:40.431546 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.431454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl6j9\" (UniqueName: \"kubernetes.io/projected/e7197e20-bb57-4167-b435-1446351d6727-kube-api-access-bl6j9\") pod \"node-resolver-wl2ck\" (UID: \"e7197e20-bb57-4167-b435-1446351d6727\") " pod="openshift-dns/node-resolver-wl2ck"
Apr 21 15:35:40.532774 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.532691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7197e20-bb57-4167-b435-1446351d6727-tmp-dir\") pod \"node-resolver-wl2ck\" (UID: \"e7197e20-bb57-4167-b435-1446351d6727\") " pod="openshift-dns/node-resolver-wl2ck"
Apr 21 15:35:40.532934 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.532826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bl6j9\" (UniqueName: \"kubernetes.io/projected/e7197e20-bb57-4167-b435-1446351d6727-kube-api-access-bl6j9\") pod \"node-resolver-wl2ck\" (UID: \"e7197e20-bb57-4167-b435-1446351d6727\") " pod="openshift-dns/node-resolver-wl2ck"
Apr 21 15:35:40.532934 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.532872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7197e20-bb57-4167-b435-1446351d6727-hosts-file\") pod \"node-resolver-wl2ck\" (UID: \"e7197e20-bb57-4167-b435-1446351d6727\") " pod="openshift-dns/node-resolver-wl2ck"
Apr 21 15:35:40.533047 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.532960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7197e20-bb57-4167-b435-1446351d6727-hosts-file\") pod \"node-resolver-wl2ck\" (UID: \"e7197e20-bb57-4167-b435-1446351d6727\") " pod="openshift-dns/node-resolver-wl2ck"
Apr 21 15:35:40.533100 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.533065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7197e20-bb57-4167-b435-1446351d6727-tmp-dir\") pod \"node-resolver-wl2ck\" (UID: \"e7197e20-bb57-4167-b435-1446351d6727\") " pod="openshift-dns/node-resolver-wl2ck"
Apr 21 15:35:40.545700 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.545670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl6j9\" (UniqueName: \"kubernetes.io/projected/e7197e20-bb57-4167-b435-1446351d6727-kube-api-access-bl6j9\") pod \"node-resolver-wl2ck\" (UID: \"e7197e20-bb57-4167-b435-1446351d6727\") " pod="openshift-dns/node-resolver-wl2ck"
Apr 21 15:35:40.584679 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.584632 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wl2ck"
Apr 21 15:35:40.596122 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:40.596046 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7197e20_bb57_4167_b435_1446351d6727.slice/crio-21d36b44f5b5d184cad813c4180be5978e59b6db2b740cbee6f803b7dd943a93 WatchSource:0}: Error finding container 21d36b44f5b5d184cad813c4180be5978e59b6db2b740cbee6f803b7dd943a93: Status 404 returned error can't find the container with id 21d36b44f5b5d184cad813c4180be5978e59b6db2b740cbee6f803b7dd943a93
Apr 21 15:35:40.797896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.797777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:40.797896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.797814 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:40.797896 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.797867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:40.798124 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:40.797954 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9"
Apr 21 15:35:40.798124 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:40.798048 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:40.798212 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:40.798142 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610"
Apr 21 15:35:40.978868 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.978826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wl2ck" event={"ID":"e7197e20-bb57-4167-b435-1446351d6727","Type":"ContainerStarted","Data":"fc3297ec9eceddf0657cb456e02aaf3aea7863d092f7b4fd93192cd36f9efcad"}
Apr 21 15:35:40.979312 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.978878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wl2ck" event={"ID":"e7197e20-bb57-4167-b435-1446351d6727","Type":"ContainerStarted","Data":"21d36b44f5b5d184cad813c4180be5978e59b6db2b740cbee6f803b7dd943a93"}
Apr 21 15:35:40.996132 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:40.996000 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wl2ck" podStartSLOduration=0.995982793 podStartE2EDuration="995.982793ms" podCreationTimestamp="2026-04-21 15:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:40.995706509 +0000 UTC m=+29.713783857" watchObservedRunningTime="2026-04-21 15:35:40.995982793 +0000 UTC m=+29.714060141"
Apr 21 15:35:42.797618 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:42.797339 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:42.798069 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:42.797350 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln"
Apr 21 15:35:42.798069 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:42.797658 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t694t" podUID="04b20ea4-ea35-461d-8228-945315c5c4e9"
Apr 21 15:35:42.798069 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:42.797752 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq5ln" podUID="8b870a2e-b786-497a-8ee3-57668a43f22d"
Apr 21 15:35:42.798069 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:42.797371 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:42.798069 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:42.797838 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6f2r4" podUID="3a193bd2-d4b3-409b-a943-668e1838d610"
Apr 21 15:35:44.143143 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.143115 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-162.ec2.internal" event="NodeReady"
Apr 21 15:35:44.143751 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.143280 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 15:35:44.186071 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.186033 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4"]
Apr 21 15:35:44.225504 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.225447 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-68568f786f-8pz2q"]
Apr 21 15:35:44.225668 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.225633 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4"
Apr 21 15:35:44.230836 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.230639 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 21 15:35:44.230836 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.230686 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:35:44.231081 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.230986 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9p5fm\""
Apr 21 15:35:44.231081 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.231036 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 21 15:35:44.231256 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.231236 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 21 15:35:44.251662 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.251619 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dv5m4"]
Apr 21 15:35:44.251840 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.251793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:35:44.255632 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.255025 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 15:35:44.255632 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.255304 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 15:35:44.255632 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.255419 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tfsv8\""
Apr 21 15:35:44.255632 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.255559 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 15:35:44.259720 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.259695 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 15:35:44.267720 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.267684 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-926t8"]
Apr 21 15:35:44.267854 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.267842 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4"
Apr 21 15:35:44.270331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.270308 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:35:44.270575 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.270561 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-5xn9c\""
Apr 21 15:35:44.270841 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.270819 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 21 15:35:44.270841 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.270837 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 21 15:35:44.271046 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.270901 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 21 15:35:44.276208 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.276185 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 21 15:35:44.290064 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.290033 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt"]
Apr 21 15:35:44.290245 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.290190 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-926t8"
Apr 21 15:35:44.293044 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.293017 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:35:44.293044 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.293021 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 21 15:35:44.293245 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.293023 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-khvv5\""
Apr 21 15:35:44.311825 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.311793 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7"]
Apr 21 15:35:44.312016 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.311972 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt"
Apr 21 15:35:44.315044 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.314985 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 21 15:35:44.315044 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.314985 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:35:44.315044 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.314987 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-sfgvm\""
Apr 21 15:35:44.315431 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.315413 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 21 15:35:44.327837 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.327810 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt"]
Apr 21 15:35:44.327837 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.327839 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4"]
Apr 21 15:35:44.328002 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.327851 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68568f786f-8pz2q"]
Apr 21 15:35:44.328002 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.327865 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7"]
Apr 21 15:35:44.328002 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.327878 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dv5m4"]
Apr 21 15:35:44.328002 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.327890 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9wdl5"]
Apr 21 15:35:44.328162 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.328001 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7"
Apr 21 15:35:44.331784 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.331740 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-ph6gq\""
Apr 21 15:35:44.331916 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.331895 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 21 15:35:44.331999 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.331982 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 21 15:35:44.332223 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.332204 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 21 15:35:44.332223 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.332210 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:35:44.351784 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.351752 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6"]
Apr 21 15:35:44.352837 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.352795 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9wdl5"
Apr 21 15:35:44.355716 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.355678 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8774b\""
Apr 21 15:35:44.355851 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.355739 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 15:35:44.355851 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.355777 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 15:35:44.365351 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e203aed0-40fa-4049-8152-8cb9d29fe09e-serving-cert\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4"
Apr 21 15:35:44.365470 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365362 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctzd4\" (UniqueName: \"kubernetes.io/projected/5e975175-9472-4f4d-ac64-96a287811fe5-kube-api-access-ctzd4\") pod \"volume-data-source-validator-7c6cbb6c87-926t8\" (UID: \"5e975175-9472-4f4d-ac64-96a287811fe5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-926t8"
Apr 21 15:35:44.365470 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-bound-sa-token\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:35:44.365470 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-trusted-ca\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:35:44.365470 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-image-registry-private-configuration\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:35:44.365470 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e203aed0-40fa-4049-8152-8cb9d29fe09e-config\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4"
Apr 21 15:35:44.365682 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/309490ab-206f-4ed9-9045-5effcdd68f2a-ca-trust-extracted\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:35:44.365682 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-installation-pull-secrets\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:35:44.365682 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365594 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/934d29fe-8f2c-43c4-850e-f630d78f8e46-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4c4f4\" (UID: \"934d29fe-8f2c-43c4-850e-f630d78f8e46\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4"
Apr 21 15:35:44.365682 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4flc\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-kube-api-access-x4flc\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:35:44.365843 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934d29fe-8f2c-43c4-850e-f630d78f8e46-config\") pod \"service-ca-operator-d6fc45fc5-4c4f4\" (UID: \"934d29fe-8f2c-43c4-850e-f630d78f8e46\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4"
Apr 21 15:35:44.365843 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365721 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dzhlw"]
Apr 21 15:35:44.365843 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8nb9\" (UniqueName: \"kubernetes.io/projected/934d29fe-8f2c-43c4-850e-f630d78f8e46-kube-api-access-x8nb9\") pod \"service-ca-operator-d6fc45fc5-4c4f4\" (UID: \"934d29fe-8f2c-43c4-850e-f630d78f8e46\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4"
Apr 21 15:35:44.365843 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365770 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:35:44.365843 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365793 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-certificates\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:35:44.365843 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365833 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e203aed0-40fa-4049-8152-8cb9d29fe09e-trusted-ca\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4"
Apr 21 15:35:44.366047 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z4sx\" (UniqueName: \"kubernetes.io/projected/e203aed0-40fa-4049-8152-8cb9d29fe09e-kube-api-access-7z4sx\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4"
Apr 21 15:35:44.366047 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.365956 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6"
Apr 21 15:35:44.368759 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.368729 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 21 15:35:44.368759 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.368728 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 21 15:35:44.368935 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.368733 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-4l77t\""
Apr 21 15:35:44.380776 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.380753 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-n554q"]
Apr 21 15:35:44.380885 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.380846 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dzhlw" Apr 21 15:35:44.386820 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.386728 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 15:35:44.387064 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.387036 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-h2kg4\"" Apr 21 15:35:44.387389 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.387179 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 15:35:44.397269 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.397244 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-926t8"] Apr 21 15:35:44.397396 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.397276 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n554q"] Apr 21 15:35:44.397396 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.397289 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6"] Apr 21 15:35:44.397396 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.397302 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9wdl5"] Apr 21 15:35:44.397396 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.397314 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dzhlw"] Apr 21 15:35:44.397396 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.397390 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n554q" Apr 21 15:35:44.400537 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.400509 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 15:35:44.400636 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.400616 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l9cdd\"" Apr 21 15:35:44.400816 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.400787 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 15:35:44.400928 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.400899 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 15:35:44.467270 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8784f5d-7f15-4691-ba7d-539cda706701-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" Apr 21 15:35:44.467458 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8287ddd5-c147-400c-b1e7-382801765df6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fcfg7\" (UID: \"8287ddd5-c147-400c-b1e7-382801765df6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" Apr 21 15:35:44.467458 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467309 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7e9ca3a-1238-49eb-be83-c342ccbacce4-tmp-dir\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:44.467458 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934d29fe-8f2c-43c4-850e-f630d78f8e46-config\") pod \"service-ca-operator-d6fc45fc5-4c4f4\" (UID: \"934d29fe-8f2c-43c4-850e-f630d78f8e46\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" Apr 21 15:35:44.467458 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8nb9\" (UniqueName: \"kubernetes.io/projected/934d29fe-8f2c-43c4-850e-f630d78f8e46-kube-api-access-x8nb9\") pod \"service-ca-operator-d6fc45fc5-4c4f4\" (UID: \"934d29fe-8f2c-43c4-850e-f630d78f8e46\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" Apr 21 15:35:44.467669 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467504 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z4sx\" (UniqueName: \"kubernetes.io/projected/e203aed0-40fa-4049-8152-8cb9d29fe09e-kube-api-access-7z4sx\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" Apr 21 15:35:44.467669 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-certificates\") pod \"image-registry-68568f786f-8pz2q\" (UID: 
\"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.467669 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqg5q\" (UniqueName: \"kubernetes.io/projected/e7e9ca3a-1238-49eb-be83-c342ccbacce4-kube-api-access-rqg5q\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:44.467669 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e203aed0-40fa-4049-8152-8cb9d29fe09e-serving-cert\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" Apr 21 15:35:44.467876 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" Apr 21 15:35:44.467876 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-trusted-ca\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.467876 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467855 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbkk4\" (UniqueName: \"kubernetes.io/projected/8287ddd5-c147-400c-b1e7-382801765df6-kube-api-access-mbkk4\") pod \"kube-storage-version-migrator-operator-6769c5d45-fcfg7\" (UID: \"8287ddd5-c147-400c-b1e7-382801765df6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" Apr 21 15:35:44.468033 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" Apr 21 15:35:44.468033 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934d29fe-8f2c-43c4-850e-f630d78f8e46-config\") pod \"service-ca-operator-d6fc45fc5-4c4f4\" (UID: \"934d29fe-8f2c-43c4-850e-f630d78f8e46\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" Apr 21 15:35:44.468033 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.467983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-image-registry-private-configuration\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.468033 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-installation-pull-secrets\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.468033 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8287ddd5-c147-400c-b1e7-382801765df6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fcfg7\" (UID: \"8287ddd5-c147-400c-b1e7-382801765df6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" Apr 21 15:35:44.468261 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dlwt\" (UniqueName: \"kubernetes.io/projected/608e7afe-b62a-4ba0-b260-74afbb27a0f7-kube-api-access-6dlwt\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" Apr 21 15:35:44.468261 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4flc\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-kube-api-access-x4flc\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.468261 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls\") pod 
\"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.468468 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-certificates\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.468468 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:44.468468 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e203aed0-40fa-4049-8152-8cb9d29fe09e-trusted-ca\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" Apr 21 15:35:44.468468 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctzd4\" (UniqueName: \"kubernetes.io/projected/5e975175-9472-4f4d-ac64-96a287811fe5-kube-api-access-ctzd4\") pod \"volume-data-source-validator-7c6cbb6c87-926t8\" (UID: \"5e975175-9472-4f4d-ac64-96a287811fe5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-926t8" Apr 21 15:35:44.468468 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468361 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch5bd\" (UniqueName: \"kubernetes.io/projected/006c10d9-f7f2-4576-8bf9-f0d0df3a923e-kube-api-access-ch5bd\") pod \"network-check-source-8894fc9bd-dzhlw\" (UID: \"006c10d9-f7f2-4576-8bf9-f0d0df3a923e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dzhlw" Apr 21 15:35:44.468468 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468393 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e9ca3a-1238-49eb-be83-c342ccbacce4-config-volume\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:44.468468 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-bound-sa-token\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.468468 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468453 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e203aed0-40fa-4049-8152-8cb9d29fe09e-config\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" Apr 21 15:35:44.468468 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/309490ab-206f-4ed9-9045-5effcdd68f2a-ca-trust-extracted\") pod \"image-registry-68568f786f-8pz2q\" (UID: 
\"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.468945 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/934d29fe-8f2c-43c4-850e-f630d78f8e46-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4c4f4\" (UID: \"934d29fe-8f2c-43c4-850e-f630d78f8e46\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" Apr 21 15:35:44.468945 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.468426 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:35:44.468945 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.468721 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68568f786f-8pz2q: secret "image-registry-tls" not found Apr 21 15:35:44.468945 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.468791 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls podName:309490ab-206f-4ed9-9045-5effcdd68f2a nodeName:}" failed. No retries permitted until 2026-04-21 15:35:44.968771227 +0000 UTC m=+33.686848574 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls") pod "image-registry-68568f786f-8pz2q" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a") : secret "image-registry-tls" not found Apr 21 15:35:44.468945 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.468817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-trusted-ca\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.469286 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.469260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e203aed0-40fa-4049-8152-8cb9d29fe09e-trusted-ca\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" Apr 21 15:35:44.469352 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.469316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/309490ab-206f-4ed9-9045-5effcdd68f2a-ca-trust-extracted\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.469618 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.469577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e203aed0-40fa-4049-8152-8cb9d29fe09e-config\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" Apr 21 15:35:44.471918 ip-10-0-136-162 kubenswrapper[2576]: 
I0421 15:35:44.471897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/934d29fe-8f2c-43c4-850e-f630d78f8e46-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4c4f4\" (UID: \"934d29fe-8f2c-43c4-850e-f630d78f8e46\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" Apr 21 15:35:44.471918 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.471904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e203aed0-40fa-4049-8152-8cb9d29fe09e-serving-cert\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" Apr 21 15:35:44.472082 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.472062 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-installation-pull-secrets\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.472953 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.472935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-image-registry-private-configuration\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.484353 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.484297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8nb9\" (UniqueName: \"kubernetes.io/projected/934d29fe-8f2c-43c4-850e-f630d78f8e46-kube-api-access-x8nb9\") pod 
\"service-ca-operator-d6fc45fc5-4c4f4\" (UID: \"934d29fe-8f2c-43c4-850e-f630d78f8e46\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" Apr 21 15:35:44.486450 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.486427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z4sx\" (UniqueName: \"kubernetes.io/projected/e203aed0-40fa-4049-8152-8cb9d29fe09e-kube-api-access-7z4sx\") pod \"console-operator-9d4b6777b-dv5m4\" (UID: \"e203aed0-40fa-4049-8152-8cb9d29fe09e\") " pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" Apr 21 15:35:44.489183 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.489163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-bound-sa-token\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.490157 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.490131 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctzd4\" (UniqueName: \"kubernetes.io/projected/5e975175-9472-4f4d-ac64-96a287811fe5-kube-api-access-ctzd4\") pod \"volume-data-source-validator-7c6cbb6c87-926t8\" (UID: \"5e975175-9472-4f4d-ac64-96a287811fe5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-926t8" Apr 21 15:35:44.490424 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.490405 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4flc\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-kube-api-access-x4flc\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.538393 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:35:44.538339 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" Apr 21 15:35:44.569699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.569664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8784f5d-7f15-4691-ba7d-539cda706701-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" Apr 21 15:35:44.569699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.569713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q" Apr 21 15:35:44.569940 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.569841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8287ddd5-c147-400c-b1e7-382801765df6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fcfg7\" (UID: \"8287ddd5-c147-400c-b1e7-382801765df6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" Apr 21 15:35:44.569940 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.569894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7e9ca3a-1238-49eb-be83-c342ccbacce4-tmp-dir\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:44.570054 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.569958 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:44.570054 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqg5q\" (UniqueName: \"kubernetes.io/projected/e7e9ca3a-1238-49eb-be83-c342ccbacce4-kube-api-access-rqg5q\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:44.570054 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" Apr 21 15:35:44.570203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbkk4\" (UniqueName: \"kubernetes.io/projected/8287ddd5-c147-400c-b1e7-382801765df6-kube-api-access-mbkk4\") pod \"kube-storage-version-migrator-operator-6769c5d45-fcfg7\" (UID: \"8287ddd5-c147-400c-b1e7-382801765df6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" Apr 21 15:35:44.570203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" Apr 21 15:35:44.570203 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.570125 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:44.570203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8287ddd5-c147-400c-b1e7-382801765df6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fcfg7\" (UID: \"8287ddd5-c147-400c-b1e7-382801765df6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" Apr 21 15:35:44.570203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlpn\" (UniqueName: \"kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn\") pod \"network-check-target-t694t\" (UID: \"04b20ea4-ea35-461d-8228-945315c5c4e9\") " pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:44.570203 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.570191 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs podName:8b870a2e-b786-497a-8ee3-57668a43f22d nodeName:}" failed. No retries permitted until 2026-04-21 15:36:16.570171774 +0000 UTC m=+65.288249117 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs") pod "network-metrics-daemon-sq5ln" (UID: "8b870a2e-b786-497a-8ee3-57668a43f22d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:44.570203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7e9ca3a-1238-49eb-be83-c342ccbacce4-tmp-dir\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:44.570454 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.570300 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 15:35:44.570454 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.570316 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 15:35:44.570454 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.570344 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls podName:608e7afe-b62a-4ba0-b260-74afbb27a0f7 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:45.070330742 +0000 UTC m=+33.788408070 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f5lpt" (UID: "608e7afe-b62a-4ba0-b260-74afbb27a0f7") : secret "samples-operator-tls" not found Apr 21 15:35:44.570454 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.570405 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert podName:b8784f5d-7f15-4691-ba7d-539cda706701 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:45.070388472 +0000 UTC m=+33.788465823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ndqn6" (UID: "b8784f5d-7f15-4691-ba7d-539cda706701") : secret "networking-console-plugin-cert" not found Apr 21 15:35:44.570645 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570464 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8784f5d-7f15-4691-ba7d-539cda706701-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" Apr 21 15:35:44.570645 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8287ddd5-c147-400c-b1e7-382801765df6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fcfg7\" (UID: \"8287ddd5-c147-400c-b1e7-382801765df6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" Apr 21 15:35:44.570645 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:35:44.570609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dlwt\" (UniqueName: \"kubernetes.io/projected/608e7afe-b62a-4ba0-b260-74afbb27a0f7-kube-api-access-6dlwt\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" Apr 21 15:35:44.570797 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570659 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q9fx\" (UniqueName: \"kubernetes.io/projected/94cdcb75-41df-488a-9f65-1dcac041f00e-kube-api-access-4q9fx\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q" Apr 21 15:35:44.570797 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:44.570797 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ch5bd\" (UniqueName: \"kubernetes.io/projected/006c10d9-f7f2-4576-8bf9-f0d0df3a923e-kube-api-access-ch5bd\") pod \"network-check-source-8894fc9bd-dzhlw\" (UID: \"006c10d9-f7f2-4576-8bf9-f0d0df3a923e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dzhlw" Apr 21 15:35:44.570945 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.570833 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:35:44.570945 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.570869 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e9ca3a-1238-49eb-be83-c342ccbacce4-config-volume\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:44.570945 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.570892 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls podName:e7e9ca3a-1238-49eb-be83-c342ccbacce4 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:45.070876886 +0000 UTC m=+33.788954230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls") pod "dns-default-9wdl5" (UID: "e7e9ca3a-1238-49eb-be83-c342ccbacce4") : secret "dns-default-metrics-tls" not found Apr 21 15:35:44.571333 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.571314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e9ca3a-1238-49eb-be83-c342ccbacce4-config-volume\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:44.573471 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.573449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvlpn\" (UniqueName: \"kubernetes.io/projected/04b20ea4-ea35-461d-8228-945315c5c4e9-kube-api-access-cvlpn\") pod \"network-check-target-t694t\" (UID: \"04b20ea4-ea35-461d-8228-945315c5c4e9\") " pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:44.578195 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.578173 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" Apr 21 15:35:44.583286 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.583264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch5bd\" (UniqueName: \"kubernetes.io/projected/006c10d9-f7f2-4576-8bf9-f0d0df3a923e-kube-api-access-ch5bd\") pod \"network-check-source-8894fc9bd-dzhlw\" (UID: \"006c10d9-f7f2-4576-8bf9-f0d0df3a923e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dzhlw" Apr 21 15:35:44.583413 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.583354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqg5q\" (UniqueName: \"kubernetes.io/projected/e7e9ca3a-1238-49eb-be83-c342ccbacce4-kube-api-access-rqg5q\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:44.583782 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.583752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8287ddd5-c147-400c-b1e7-382801765df6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fcfg7\" (UID: \"8287ddd5-c147-400c-b1e7-382801765df6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" Apr 21 15:35:44.584719 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.584694 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbkk4\" (UniqueName: \"kubernetes.io/projected/8287ddd5-c147-400c-b1e7-382801765df6-kube-api-access-mbkk4\") pod \"kube-storage-version-migrator-operator-6769c5d45-fcfg7\" (UID: \"8287ddd5-c147-400c-b1e7-382801765df6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" Apr 21 15:35:44.584930 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:35:44.584908 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dlwt\" (UniqueName: \"kubernetes.io/projected/608e7afe-b62a-4ba0-b260-74afbb27a0f7-kube-api-access-6dlwt\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" Apr 21 15:35:44.600549 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.600522 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-926t8" Apr 21 15:35:44.640904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.640866 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" Apr 21 15:35:44.672081 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.672041 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4q9fx\" (UniqueName: \"kubernetes.io/projected/94cdcb75-41df-488a-9f65-1dcac041f00e-kube-api-access-4q9fx\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q" Apr 21 15:35:44.672272 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.672123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q" Apr 21 15:35:44.672272 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.672249 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:35:44.672389 ip-10-0-136-162 kubenswrapper[2576]: E0421 
15:35:44.672322 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert podName:94cdcb75-41df-488a-9f65-1dcac041f00e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:45.172301696 +0000 UTC m=+33.890379034 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert") pod "ingress-canary-n554q" (UID: "94cdcb75-41df-488a-9f65-1dcac041f00e") : secret "canary-serving-cert" not found Apr 21 15:35:44.683170 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.683143 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q9fx\" (UniqueName: \"kubernetes.io/projected/94cdcb75-41df-488a-9f65-1dcac041f00e-kube-api-access-4q9fx\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q" Apr 21 15:35:44.689836 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.689804 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dzhlw" Apr 21 15:35:44.797439 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.797337 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4" Apr 21 15:35:44.797439 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.797371 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:44.797439 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.797429 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:35:44.800507 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.800454 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 15:35:44.800651 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.800543 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 15:35:44.800735 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.800688 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dzs5m\"" Apr 21 15:35:44.801165 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.801139 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l72g6\"" Apr 21 15:35:44.809759 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.809738 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:35:44.975383 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:44.975341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:44.975607 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.975516 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:35:44.975607 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.975539 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68568f786f-8pz2q: secret "image-registry-tls" not found Apr 21 15:35:44.975607 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:44.975607 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls podName:309490ab-206f-4ed9-9045-5effcdd68f2a nodeName:}" failed. No retries permitted until 2026-04-21 15:35:45.975586022 +0000 UTC m=+34.693663354 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls") pod "image-registry-68568f786f-8pz2q" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a") : secret "image-registry-tls" not found Apr 21 15:35:45.079568 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.079407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" Apr 21 15:35:45.079568 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.079481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" Apr 21 15:35:45.079829 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.079594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:35:45.079829 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:45.079782 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:35:45.079916 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:45.079856 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls podName:e7e9ca3a-1238-49eb-be83-c342ccbacce4 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:46.079834874 +0000 UTC m=+34.797912216 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls") pod "dns-default-9wdl5" (UID: "e7e9ca3a-1238-49eb-be83-c342ccbacce4") : secret "dns-default-metrics-tls" not found Apr 21 15:35:45.080208 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:45.080098 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 15:35:45.080208 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:45.080188 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert podName:b8784f5d-7f15-4691-ba7d-539cda706701 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:46.080168433 +0000 UTC m=+34.798245777 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ndqn6" (UID: "b8784f5d-7f15-4691-ba7d-539cda706701") : secret "networking-console-plugin-cert" not found Apr 21 15:35:45.080393 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:45.080282 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 15:35:45.080467 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:45.080441 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls podName:608e7afe-b62a-4ba0-b260-74afbb27a0f7 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:46.080412749 +0000 UTC m=+34.798490084 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f5lpt" (UID: "608e7afe-b62a-4ba0-b260-74afbb27a0f7") : secret "samples-operator-tls" not found Apr 21 15:35:45.180794 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.180746 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q" Apr 21 15:35:45.181460 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:45.180909 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:35:45.181460 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:45.180981 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert podName:94cdcb75-41df-488a-9f65-1dcac041f00e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:46.180963171 +0000 UTC m=+34.899040509 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert") pod "ingress-canary-n554q" (UID: "94cdcb75-41df-488a-9f65-1dcac041f00e") : secret "canary-serving-cert" not found Apr 21 15:35:45.893413 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.893211 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t694t"] Apr 21 15:35:45.894360 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.894325 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-926t8"] Apr 21 15:35:45.896327 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.896297 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dv5m4"] Apr 21 15:35:45.901712 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:45.901684 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e975175_9472_4f4d_ac64_96a287811fe5.slice/crio-619462299d2a72f69e72120c7f31bfa9a55f43ddaf1274c925169784668e4907 WatchSource:0}: Error finding container 619462299d2a72f69e72120c7f31bfa9a55f43ddaf1274c925169784668e4907: Status 404 returned error can't find the container with id 619462299d2a72f69e72120c7f31bfa9a55f43ddaf1274c925169784668e4907 Apr 21 15:35:45.902587 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:45.902547 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode203aed0_40fa_4049_8152_8cb9d29fe09e.slice/crio-67308b3f7e6a265bbd6e86dd0c585a5d56c8db0b61e67614df1ef1ed4fd6ce46 WatchSource:0}: Error finding container 67308b3f7e6a265bbd6e86dd0c585a5d56c8db0b61e67614df1ef1ed4fd6ce46: Status 404 returned error can't find the container with id 67308b3f7e6a265bbd6e86dd0c585a5d56c8db0b61e67614df1ef1ed4fd6ce46 Apr 21 
15:35:45.906656 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.906635 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7"] Apr 21 15:35:45.907786 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:45.907757 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8287ddd5_c147_400c_b1e7_382801765df6.slice/crio-db1000dc613c694930fa8b132aa49630a4f0bbc45c8430c68d980a835f42141e WatchSource:0}: Error finding container db1000dc613c694930fa8b132aa49630a4f0bbc45c8430c68d980a835f42141e: Status 404 returned error can't find the container with id db1000dc613c694930fa8b132aa49630a4f0bbc45c8430c68d980a835f42141e Apr 21 15:35:45.910726 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.910703 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dzhlw"] Apr 21 15:35:45.911583 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.911560 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4"] Apr 21 15:35:45.912400 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:45.912374 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod934d29fe_8f2c_43c4_850e_f630d78f8e46.slice/crio-e651146100745926d3315ac61c39104bec0ba463f262b5746c3b7fa42ca2586d WatchSource:0}: Error finding container e651146100745926d3315ac61c39104bec0ba463f262b5746c3b7fa42ca2586d: Status 404 returned error can't find the container with id e651146100745926d3315ac61c39104bec0ba463f262b5746c3b7fa42ca2586d Apr 21 15:35:45.913194 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:45.913174 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod006c10d9_f7f2_4576_8bf9_f0d0df3a923e.slice/crio-06303d4ade4fefdbfe45d6cfb96d791f4d3fb758c41a718c04d08a3953c8d5d7 WatchSource:0}: Error finding container 06303d4ade4fefdbfe45d6cfb96d791f4d3fb758c41a718c04d08a3953c8d5d7: Status 404 returned error can't find the container with id 06303d4ade4fefdbfe45d6cfb96d791f4d3fb758c41a718c04d08a3953c8d5d7 Apr 21 15:35:45.988520 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.988445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:35:45.988734 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:45.988628 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:35:45.988734 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:45.988644 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68568f786f-8pz2q: secret "image-registry-tls" not found Apr 21 15:35:45.988734 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:45.988702 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls podName:309490ab-206f-4ed9-9045-5effcdd68f2a nodeName:}" failed. No retries permitted until 2026-04-21 15:35:47.988685585 +0000 UTC m=+36.706762910 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls") pod "image-registry-68568f786f-8pz2q" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a") : secret "image-registry-tls" not found Apr 21 15:35:45.991310 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.991241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dzhlw" event={"ID":"006c10d9-f7f2-4576-8bf9-f0d0df3a923e","Type":"ContainerStarted","Data":"06303d4ade4fefdbfe45d6cfb96d791f4d3fb758c41a718c04d08a3953c8d5d7"} Apr 21 15:35:45.992521 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.992457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-926t8" event={"ID":"5e975175-9472-4f4d-ac64-96a287811fe5","Type":"ContainerStarted","Data":"619462299d2a72f69e72120c7f31bfa9a55f43ddaf1274c925169784668e4907"} Apr 21 15:35:45.993784 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.993754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t694t" event={"ID":"04b20ea4-ea35-461d-8228-945315c5c4e9","Type":"ContainerStarted","Data":"c5e7810784d4d40b02addab018c2a7f314cae843c0eb64528b931a04e2afa09b"} Apr 21 15:35:45.994803 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.994777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" event={"ID":"e203aed0-40fa-4049-8152-8cb9d29fe09e","Type":"ContainerStarted","Data":"67308b3f7e6a265bbd6e86dd0c585a5d56c8db0b61e67614df1ef1ed4fd6ce46"} Apr 21 15:35:45.995816 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.995797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" 
event={"ID":"8287ddd5-c147-400c-b1e7-382801765df6","Type":"ContainerStarted","Data":"db1000dc613c694930fa8b132aa49630a4f0bbc45c8430c68d980a835f42141e"}
Apr 21 15:35:45.997085 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:45.997063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" event={"ID":"934d29fe-8f2c-43c4-850e-f630d78f8e46","Type":"ContainerStarted","Data":"e651146100745926d3315ac61c39104bec0ba463f262b5746c3b7fa42ca2586d"}
Apr 21 15:35:46.089266 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:46.089240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt"
Apr 21 15:35:46.089338 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:46.089282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6"
Apr 21 15:35:46.089406 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:46.089339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5"
Apr 21 15:35:46.089406 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:46.089380 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 15:35:46.089549 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:46.089443 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls podName:608e7afe-b62a-4ba0-b260-74afbb27a0f7 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:48.089426769 +0000 UTC m=+36.807504095 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f5lpt" (UID: "608e7afe-b62a-4ba0-b260-74afbb27a0f7") : secret "samples-operator-tls" not found
Apr 21 15:35:46.089549 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:46.089452 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 15:35:46.089549 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:46.089476 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:46.089549 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:46.089545 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert podName:b8784f5d-7f15-4691-ba7d-539cda706701 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:48.089526386 +0000 UTC m=+36.807603725 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ndqn6" (UID: "b8784f5d-7f15-4691-ba7d-539cda706701") : secret "networking-console-plugin-cert" not found
Apr 21 15:35:46.089687 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:46.089567 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls podName:e7e9ca3a-1238-49eb-be83-c342ccbacce4 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:48.089556644 +0000 UTC m=+36.807633973 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls") pod "dns-default-9wdl5" (UID: "e7e9ca3a-1238-49eb-be83-c342ccbacce4") : secret "dns-default-metrics-tls" not found
Apr 21 15:35:46.190668 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:46.190634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q"
Apr 21 15:35:46.191089 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:46.190779 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:46.191089 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:46.190861 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert podName:94cdcb75-41df-488a-9f65-1dcac041f00e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:48.190832708 +0000 UTC m=+36.908910033 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert") pod "ingress-canary-n554q" (UID: "94cdcb75-41df-488a-9f65-1dcac041f00e") : secret "canary-serving-cert" not found
Apr 21 15:35:47.005869 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:47.005565 2576 generic.go:358] "Generic (PLEG): container finished" podID="eb648a31-68d5-41f6-8194-806717864579" containerID="9b82b2fb02c5dd40523f42cfd764c2e0f19364a4b202a8dc0b3fff58649e6425" exitCode=0
Apr 21 15:35:47.005869 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:47.005646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmb6k" event={"ID":"eb648a31-68d5-41f6-8194-806717864579","Type":"ContainerDied","Data":"9b82b2fb02c5dd40523f42cfd764c2e0f19364a4b202a8dc0b3fff58649e6425"}
Apr 21 15:35:48.008503 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:48.008440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:35:48.009051 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:48.008683 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 15:35:48.009051 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:48.008703 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68568f786f-8pz2q: secret "image-registry-tls" not found
Apr 21 15:35:48.009051 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:48.008762 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls podName:309490ab-206f-4ed9-9045-5effcdd68f2a nodeName:}" failed. No retries permitted until 2026-04-21 15:35:52.008743472 +0000 UTC m=+40.726820808 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls") pod "image-registry-68568f786f-8pz2q" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a") : secret "image-registry-tls" not found
Apr 21 15:35:48.019941 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:48.019894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmb6k" event={"ID":"eb648a31-68d5-41f6-8194-806717864579","Type":"ContainerDied","Data":"6acc2e989e7a52190d56057081600f35eebaf608c38f226480ba8f39a5c50aac"}
Apr 21 15:35:48.020647 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:48.020611 2576 generic.go:358] "Generic (PLEG): container finished" podID="eb648a31-68d5-41f6-8194-806717864579" containerID="6acc2e989e7a52190d56057081600f35eebaf608c38f226480ba8f39a5c50aac" exitCode=0
Apr 21 15:35:48.110085 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:48.109484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt"
Apr 21 15:35:48.110085 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:48.109579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6"
Apr 21 15:35:48.110085 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:48.109666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5"
Apr 21 15:35:48.110085 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:48.109814 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:48.110085 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:48.109881 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls podName:e7e9ca3a-1238-49eb-be83-c342ccbacce4 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:52.109860394 +0000 UTC m=+40.827937724 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls") pod "dns-default-9wdl5" (UID: "e7e9ca3a-1238-49eb-be83-c342ccbacce4") : secret "dns-default-metrics-tls" not found
Apr 21 15:35:48.110519 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:48.110192 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 15:35:48.110519 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:48.110205 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 15:35:48.110519 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:48.110261 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert podName:b8784f5d-7f15-4691-ba7d-539cda706701 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:52.110243232 +0000 UTC m=+40.828320563 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ndqn6" (UID: "b8784f5d-7f15-4691-ba7d-539cda706701") : secret "networking-console-plugin-cert" not found
Apr 21 15:35:48.110519 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:48.110280 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls podName:608e7afe-b62a-4ba0-b260-74afbb27a0f7 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:52.110270845 +0000 UTC m=+40.828348187 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f5lpt" (UID: "608e7afe-b62a-4ba0-b260-74afbb27a0f7") : secret "samples-operator-tls" not found
Apr 21 15:35:48.211120 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:48.210818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q"
Apr 21 15:35:48.211340 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:48.211152 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:48.211340 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:48.211236 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert podName:94cdcb75-41df-488a-9f65-1dcac041f00e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:52.211203482 +0000 UTC m=+40.929280815 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert") pod "ingress-canary-n554q" (UID: "94cdcb75-41df-488a-9f65-1dcac041f00e") : secret "canary-serving-cert" not found
Apr 21 15:35:51.338663 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:51.338619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:51.342348 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:51.342326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a193bd2-d4b3-409b-a943-668e1838d610-original-pull-secret\") pod \"global-pull-secret-syncer-6f2r4\" (UID: \"3a193bd2-d4b3-409b-a943-668e1838d610\") " pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:51.425387 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:51.425351 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6f2r4"
Apr 21 15:35:51.886401 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:51.886377 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6f2r4"]
Apr 21 15:35:51.889840 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:35:51.889809 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a193bd2_d4b3_409b_a943_668e1838d610.slice/crio-984944931f49ed847494cf21ef7e66a02bde6a9aec4ec32cb32394c09fc5df93 WatchSource:0}: Error finding container 984944931f49ed847494cf21ef7e66a02bde6a9aec4ec32cb32394c09fc5df93: Status 404 returned error can't find the container with id 984944931f49ed847494cf21ef7e66a02bde6a9aec4ec32cb32394c09fc5df93
Apr 21 15:35:52.036115 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.035724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/0.log"
Apr 21 15:35:52.036115 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.035793 2576 generic.go:358] "Generic (PLEG): container finished" podID="e203aed0-40fa-4049-8152-8cb9d29fe09e" containerID="346de3aad237e8fb36e2062f14cc29c9f127a2dbe8ba7eb91d4719984414f4a7" exitCode=255
Apr 21 15:35:52.036115 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.035898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" event={"ID":"e203aed0-40fa-4049-8152-8cb9d29fe09e","Type":"ContainerDied","Data":"346de3aad237e8fb36e2062f14cc29c9f127a2dbe8ba7eb91d4719984414f4a7"}
Apr 21 15:35:52.036356 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.036117 2576 scope.go:117] "RemoveContainer" containerID="346de3aad237e8fb36e2062f14cc29c9f127a2dbe8ba7eb91d4719984414f4a7"
Apr 21 15:35:52.044792 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.044422 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:35:52.044792 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.044565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmb6k" event={"ID":"eb648a31-68d5-41f6-8194-806717864579","Type":"ContainerStarted","Data":"39ac8ee6b02fbbf46510769b19f8fc80c7de7f6fb785afdc3e64d0ea63b3c72a"}
Apr 21 15:35:52.044792 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:52.044689 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 15:35:52.044792 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:52.044703 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68568f786f-8pz2q: secret "image-registry-tls" not found
Apr 21 15:35:52.044792 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:52.044756 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls podName:309490ab-206f-4ed9-9045-5effcdd68f2a nodeName:}" failed. No retries permitted until 2026-04-21 15:36:00.044736956 +0000 UTC m=+48.762814286 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls") pod "image-registry-68568f786f-8pz2q" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a") : secret "image-registry-tls" not found
Apr 21 15:35:52.046140 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.046110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6f2r4" event={"ID":"3a193bd2-d4b3-409b-a943-668e1838d610","Type":"ContainerStarted","Data":"984944931f49ed847494cf21ef7e66a02bde6a9aec4ec32cb32394c09fc5df93"}
Apr 21 15:35:52.047703 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.047677 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" event={"ID":"8287ddd5-c147-400c-b1e7-382801765df6","Type":"ContainerStarted","Data":"f782612981949c70105200660c637163eaf47a860051a09171e5b5842a742999"}
Apr 21 15:35:52.050123 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.050094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" event={"ID":"934d29fe-8f2c-43c4-850e-f630d78f8e46","Type":"ContainerStarted","Data":"5ce09b69a0a622c437e4491e7360b6665a0c15e92bb0ebc1acbb811cb4000b76"}
Apr 21 15:35:52.053347 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.053319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dzhlw" event={"ID":"006c10d9-f7f2-4576-8bf9-f0d0df3a923e","Type":"ContainerStarted","Data":"8a78177e70135ffbc563e28cc63b354280249cf9bb06a2e7601df128409dc25d"}
Apr 21 15:35:52.078142 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.078087 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" podStartSLOduration=26.250781339 podStartE2EDuration="32.078066659s" podCreationTimestamp="2026-04-21 15:35:20 +0000 UTC" firstStartedPulling="2026-04-21 15:35:45.91452252 +0000 UTC m=+34.632599849" lastFinishedPulling="2026-04-21 15:35:51.741807836 +0000 UTC m=+40.459885169" observedRunningTime="2026-04-21 15:35:52.077019975 +0000 UTC m=+40.795097315" watchObservedRunningTime="2026-04-21 15:35:52.078066659 +0000 UTC m=+40.796144008"
Apr 21 15:35:52.101077 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.100947 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dzhlw" podStartSLOduration=12.274414682 podStartE2EDuration="18.100926465s" podCreationTimestamp="2026-04-21 15:35:34 +0000 UTC" firstStartedPulling="2026-04-21 15:35:45.91519623 +0000 UTC m=+34.633273573" lastFinishedPulling="2026-04-21 15:35:51.74170801 +0000 UTC m=+40.459785356" observedRunningTime="2026-04-21 15:35:52.10010402 +0000 UTC m=+40.818181375" watchObservedRunningTime="2026-04-21 15:35:52.100926465 +0000 UTC m=+40.819003814"
Apr 21 15:35:52.136385 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.136309 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kmb6k" podStartSLOduration=8.278589872 podStartE2EDuration="41.136286955s" podCreationTimestamp="2026-04-21 15:35:11 +0000 UTC" firstStartedPulling="2026-04-21 15:35:13.059399442 +0000 UTC m=+1.777476768" lastFinishedPulling="2026-04-21 15:35:45.917096526 +0000 UTC m=+34.635173851" observedRunningTime="2026-04-21 15:35:52.13578357 +0000 UTC m=+40.853860919" watchObservedRunningTime="2026-04-21 15:35:52.136286955 +0000 UTC m=+40.854364305"
Apr 21 15:35:52.146110 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.146076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt"
Apr 21 15:35:52.146235 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.146155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6"
Apr 21 15:35:52.146305 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.146289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5"
Apr 21 15:35:52.146915 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:52.146895 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:52.146979 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:52.146959 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls podName:e7e9ca3a-1238-49eb-be83-c342ccbacce4 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:00.146940637 +0000 UTC m=+48.865017973 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls") pod "dns-default-9wdl5" (UID: "e7e9ca3a-1238-49eb-be83-c342ccbacce4") : secret "dns-default-metrics-tls" not found
Apr 21 15:35:52.147320 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:52.147302 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 15:35:52.147382 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:52.147373 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls podName:608e7afe-b62a-4ba0-b260-74afbb27a0f7 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:00.147356191 +0000 UTC m=+48.865433525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f5lpt" (UID: "608e7afe-b62a-4ba0-b260-74afbb27a0f7") : secret "samples-operator-tls" not found
Apr 21 15:35:52.148073 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:52.148057 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 15:35:52.148129 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:52.148123 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert podName:b8784f5d-7f15-4691-ba7d-539cda706701 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:00.148106828 +0000 UTC m=+48.866184157 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ndqn6" (UID: "b8784f5d-7f15-4691-ba7d-539cda706701") : secret "networking-console-plugin-cert" not found
Apr 21 15:35:52.247454 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:52.247419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q"
Apr 21 15:35:52.247665 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:52.247623 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:52.247727 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:52.247701 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert podName:94cdcb75-41df-488a-9f65-1dcac041f00e nodeName:}" failed. No retries permitted until 2026-04-21 15:36:00.24767883 +0000 UTC m=+48.965756162 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert") pod "ingress-canary-n554q" (UID: "94cdcb75-41df-488a-9f65-1dcac041f00e") : secret "canary-serving-cert" not found
Apr 21 15:35:53.058704 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.058647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-926t8" event={"ID":"5e975175-9472-4f4d-ac64-96a287811fe5","Type":"ContainerStarted","Data":"710df473ad1c74b7998e808d75f01005d8caf6abf4dd9a9b2bd42ed048e1d494"}
Apr 21 15:35:53.060153 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.060112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t694t" event={"ID":"04b20ea4-ea35-461d-8228-945315c5c4e9","Type":"ContainerStarted","Data":"db37eb0b31f32c83ebfd518c47cf8981cbab4b61ef9eb7907ce3d59ab708944e"}
Apr 21 15:35:53.060285 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.060257 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-t694t"
Apr 21 15:35:53.061684 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.061662 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log"
Apr 21 15:35:53.062064 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.062046 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/0.log"
Apr 21 15:35:53.062136 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.062086 2576 generic.go:358] "Generic (PLEG): container finished" podID="e203aed0-40fa-4049-8152-8cb9d29fe09e" containerID="1bab239590c6ac2dacd9fa673147c9129f35bd73a508440671ac21e96bd67ec8" exitCode=255
Apr 21 15:35:53.062925 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.062902 2576 scope.go:117] "RemoveContainer" containerID="1bab239590c6ac2dacd9fa673147c9129f35bd73a508440671ac21e96bd67ec8"
Apr 21 15:35:53.063311 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:53.063066 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dv5m4_openshift-console-operator(e203aed0-40fa-4049-8152-8cb9d29fe09e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" podUID="e203aed0-40fa-4049-8152-8cb9d29fe09e"
Apr 21 15:35:53.063311 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.063137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" event={"ID":"e203aed0-40fa-4049-8152-8cb9d29fe09e","Type":"ContainerDied","Data":"1bab239590c6ac2dacd9fa673147c9129f35bd73a508440671ac21e96bd67ec8"}
Apr 21 15:35:53.063311 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.063187 2576 scope.go:117] "RemoveContainer" containerID="346de3aad237e8fb36e2062f14cc29c9f127a2dbe8ba7eb91d4719984414f4a7"
Apr 21 15:35:53.078448 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.078316 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-926t8" podStartSLOduration=27.23997321 podStartE2EDuration="33.078299597s" podCreationTimestamp="2026-04-21 15:35:20 +0000 UTC" firstStartedPulling="2026-04-21 15:35:45.903774691 +0000 UTC m=+34.621852021" lastFinishedPulling="2026-04-21 15:35:51.742101079 +0000 UTC m=+40.460178408" observedRunningTime="2026-04-21 15:35:53.077263332 +0000 UTC m=+41.795340681" watchObservedRunningTime="2026-04-21 15:35:53.078299597 +0000 UTC m=+41.796376947"
Apr 21 15:35:53.078900 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.078682 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" podStartSLOduration=27.24574983 podStartE2EDuration="33.078674899s" podCreationTimestamp="2026-04-21 15:35:20 +0000 UTC" firstStartedPulling="2026-04-21 15:35:45.910174943 +0000 UTC m=+34.628252277" lastFinishedPulling="2026-04-21 15:35:51.743100019 +0000 UTC m=+40.461177346" observedRunningTime="2026-04-21 15:35:52.154941891 +0000 UTC m=+40.873019235" watchObservedRunningTime="2026-04-21 15:35:53.078674899 +0000 UTC m=+41.796752248"
Apr 21 15:35:53.117008 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:53.116949 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-t694t" podStartSLOduration=36.063650482 podStartE2EDuration="42.116927884s" podCreationTimestamp="2026-04-21 15:35:11 +0000 UTC" firstStartedPulling="2026-04-21 15:35:45.897659039 +0000 UTC m=+34.615736366" lastFinishedPulling="2026-04-21 15:35:51.950936427 +0000 UTC m=+40.669013768" observedRunningTime="2026-04-21 15:35:53.115042589 +0000 UTC m=+41.833119939" watchObservedRunningTime="2026-04-21 15:35:53.116927884 +0000 UTC m=+41.835005233"
Apr 21 15:35:54.066993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:54.066962 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log"
Apr 21 15:35:54.067512 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:54.067368 2576 scope.go:117] "RemoveContainer" containerID="1bab239590c6ac2dacd9fa673147c9129f35bd73a508440671ac21e96bd67ec8"
Apr 21 15:35:54.067663 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:54.067632 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dv5m4_openshift-console-operator(e203aed0-40fa-4049-8152-8cb9d29fe09e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" podUID="e203aed0-40fa-4049-8152-8cb9d29fe09e"
Apr 21 15:35:54.578852 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:54.578801 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4"
Apr 21 15:35:54.578852 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:54.578851 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4"
Apr 21 15:35:55.072886 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:55.072846 2576 scope.go:117] "RemoveContainer" containerID="1bab239590c6ac2dacd9fa673147c9129f35bd73a508440671ac21e96bd67ec8"
Apr 21 15:35:55.073350 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:35:55.073138 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dv5m4_openshift-console-operator(e203aed0-40fa-4049-8152-8cb9d29fe09e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" podUID="e203aed0-40fa-4049-8152-8cb9d29fe09e"
Apr 21 15:35:57.079347 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:57.079300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6f2r4" event={"ID":"3a193bd2-d4b3-409b-a943-668e1838d610","Type":"ContainerStarted","Data":"ae22814df8d5e123756a4c8cfc2e3419623c3aeb03b8fcc34dfd09ca87745153"}
Apr 21 15:35:57.123865 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:35:57.123812 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6f2r4" podStartSLOduration=33.983314961 podStartE2EDuration="38.123796753s" podCreationTimestamp="2026-04-21 15:35:19 +0000 UTC" firstStartedPulling="2026-04-21 15:35:51.917281209 +0000 UTC m=+40.635358535" lastFinishedPulling="2026-04-21 15:35:56.057762999 +0000 UTC m=+44.775840327" observedRunningTime="2026-04-21 15:35:57.123564412 +0000 UTC m=+45.841641771" watchObservedRunningTime="2026-04-21 15:35:57.123796753 +0000 UTC m=+45.841874101"
Apr 21 15:36:00.115070 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:00.115028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:36:00.115519 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:00.115175 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 15:36:00.115519 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:00.115198 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68568f786f-8pz2q: secret "image-registry-tls" not found
Apr 21 15:36:00.115519 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:00.115258 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls podName:309490ab-206f-4ed9-9045-5effcdd68f2a nodeName:}" failed. No retries permitted until 2026-04-21 15:36:16.115241699 +0000 UTC m=+64.833319024 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls") pod "image-registry-68568f786f-8pz2q" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a") : secret "image-registry-tls" not found
Apr 21 15:36:00.215776 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:00.215742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6"
Apr 21 15:36:00.215947 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:00.215808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5"
Apr 21 15:36:00.215947 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:00.215900 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:00.215947 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:00.215908 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 15:36:00.216073 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:00.215955 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls podName:e7e9ca3a-1238-49eb-be83-c342ccbacce4 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:16.215942576 +0000 UTC m=+64.934019902 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls") pod "dns-default-9wdl5" (UID: "e7e9ca3a-1238-49eb-be83-c342ccbacce4") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:00.216073 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:00.215999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt"
Apr 21 15:36:00.216073 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:00.216041 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert podName:b8784f5d-7f15-4691-ba7d-539cda706701 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:16.216019801 +0000 UTC m=+64.934097127 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ndqn6" (UID: "b8784f5d-7f15-4691-ba7d-539cda706701") : secret "networking-console-plugin-cert" not found
Apr 21 15:36:00.216073 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:00.216048 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 15:36:00.216073 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:00.216079 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls podName:608e7afe-b62a-4ba0-b260-74afbb27a0f7 nodeName:}" failed.
No retries permitted until 2026-04-21 15:36:16.216072607 +0000 UTC m=+64.934149933 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f5lpt" (UID: "608e7afe-b62a-4ba0-b260-74afbb27a0f7") : secret "samples-operator-tls" not found Apr 21 15:36:00.316720 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:00.316677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q" Apr 21 15:36:00.316876 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:00.316830 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:36:00.316912 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:00.316893 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert podName:94cdcb75-41df-488a-9f65-1dcac041f00e nodeName:}" failed. No retries permitted until 2026-04-21 15:36:16.316877945 +0000 UTC m=+65.034955270 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert") pod "ingress-canary-n554q" (UID: "94cdcb75-41df-488a-9f65-1dcac041f00e") : secret "canary-serving-cert" not found Apr 21 15:36:06.797867 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:06.797838 2576 scope.go:117] "RemoveContainer" containerID="1bab239590c6ac2dacd9fa673147c9129f35bd73a508440671ac21e96bd67ec8" Apr 21 15:36:07.106581 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:07.106553 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log" Apr 21 15:36:07.106744 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:07.106603 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" event={"ID":"e203aed0-40fa-4049-8152-8cb9d29fe09e","Type":"ContainerStarted","Data":"73dfbe6ed49be2c0af4f194e597431479626181cc4f82db78df9b81c568fbbd8"} Apr 21 15:36:07.106921 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:07.106889 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" Apr 21 15:36:07.134034 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:07.130925 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" podStartSLOduration=41.294463251 podStartE2EDuration="47.130898068s" podCreationTimestamp="2026-04-21 15:35:20 +0000 UTC" firstStartedPulling="2026-04-21 15:35:45.905280713 +0000 UTC m=+34.623358039" lastFinishedPulling="2026-04-21 15:35:51.741715526 +0000 UTC m=+40.459792856" observedRunningTime="2026-04-21 15:36:07.129485267 +0000 UTC m=+55.847562614" watchObservedRunningTime="2026-04-21 15:36:07.130898068 +0000 UTC m=+55.848975417" Apr 21 15:36:07.357655 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:36:07.357567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-dv5m4" Apr 21 15:36:10.990193 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:10.990163 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2fxv" Apr 21 15:36:16.145594 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.145550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:36:16.148017 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.147977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls\") pod \"image-registry-68568f786f-8pz2q\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:36:16.246771 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.246723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" Apr 21 15:36:16.246947 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.246784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls\") pod 
\"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:36:16.246947 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.246843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" Apr 21 15:36:16.249261 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.249223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8784f5d-7f15-4691-ba7d-539cda706701-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ndqn6\" (UID: \"b8784f5d-7f15-4691-ba7d-539cda706701\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" Apr 21 15:36:16.249388 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.249260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7e9ca3a-1238-49eb-be83-c342ccbacce4-metrics-tls\") pod \"dns-default-9wdl5\" (UID: \"e7e9ca3a-1238-49eb-be83-c342ccbacce4\") " pod="openshift-dns/dns-default-9wdl5" Apr 21 15:36:16.249388 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.249345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/608e7afe-b62a-4ba0-b260-74afbb27a0f7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5lpt\" (UID: \"608e7afe-b62a-4ba0-b260-74afbb27a0f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" Apr 21 15:36:16.347733 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.347692 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q" Apr 21 15:36:16.350218 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.350185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94cdcb75-41df-488a-9f65-1dcac041f00e-cert\") pod \"ingress-canary-n554q\" (UID: \"94cdcb75-41df-488a-9f65-1dcac041f00e\") " pod="openshift-ingress-canary/ingress-canary-n554q" Apr 21 15:36:16.367346 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.367319 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tfsv8\"" Apr 21 15:36:16.374697 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.374669 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:36:16.426390 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.426362 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-sfgvm\"" Apr 21 15:36:16.433596 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.433563 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" Apr 21 15:36:16.467628 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.467439 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8774b\"" Apr 21 15:36:16.474782 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.474734 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9wdl5" Apr 21 15:36:16.478943 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.478898 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-4l77t\"" Apr 21 15:36:16.487756 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.487157 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" Apr 21 15:36:16.511337 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.511301 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68568f786f-8pz2q"] Apr 21 15:36:16.511539 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.511447 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l9cdd\"" Apr 21 15:36:16.516382 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:16.516352 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod309490ab_206f_4ed9_9045_5effcdd68f2a.slice/crio-08d89baabbca8b9f98730d833b057807050530b67ff5cc8f8d2e61f71267efc8 WatchSource:0}: Error finding container 08d89baabbca8b9f98730d833b057807050530b67ff5cc8f8d2e61f71267efc8: Status 404 returned error can't find the container with id 08d89baabbca8b9f98730d833b057807050530b67ff5cc8f8d2e61f71267efc8 Apr 21 15:36:16.517426 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.517395 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n554q" Apr 21 15:36:16.599173 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.599132 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt"] Apr 21 15:36:16.650556 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.649780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs\") pod \"network-metrics-daemon-sq5ln\" (UID: \"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:36:16.652270 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.652189 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6"] Apr 21 15:36:16.652588 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.652569 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 15:36:16.655157 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:16.655128 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8784f5d_7f15_4691_ba7d_539cda706701.slice/crio-923ed5ceedfa218dd9b400feb753e0735b1ccabafc5323910884d5a0251f182c WatchSource:0}: Error finding container 923ed5ceedfa218dd9b400feb753e0735b1ccabafc5323910884d5a0251f182c: Status 404 returned error can't find the container with id 923ed5ceedfa218dd9b400feb753e0735b1ccabafc5323910884d5a0251f182c Apr 21 15:36:16.664144 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.664118 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b870a2e-b786-497a-8ee3-57668a43f22d-metrics-certs\") pod \"network-metrics-daemon-sq5ln\" (UID: 
\"8b870a2e-b786-497a-8ee3-57668a43f22d\") " pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:36:16.667996 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.667971 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9wdl5"] Apr 21 15:36:16.675525 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:16.675436 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7e9ca3a_1238_49eb_be83_c342ccbacce4.slice/crio-ccc086c5f294ebe47eb4f19ac7f4fab0d0eec8bf3a0d9f245a5c65d98c3448e5 WatchSource:0}: Error finding container ccc086c5f294ebe47eb4f19ac7f4fab0d0eec8bf3a0d9f245a5c65d98c3448e5: Status 404 returned error can't find the container with id ccc086c5f294ebe47eb4f19ac7f4fab0d0eec8bf3a0d9f245a5c65d98c3448e5 Apr 21 15:36:16.693124 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.693098 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n554q"] Apr 21 15:36:16.696172 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:16.696146 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94cdcb75_41df_488a_9f65_1dcac041f00e.slice/crio-d63d97ee65beeb4c879148623a0bebc9595640adc26aeab056c4b0607c32b445 WatchSource:0}: Error finding container d63d97ee65beeb4c879148623a0bebc9595640adc26aeab056c4b0607c32b445: Status 404 returned error can't find the container with id d63d97ee65beeb4c879148623a0bebc9595640adc26aeab056c4b0607c32b445 Apr 21 15:36:16.921364 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.921330 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dzs5m\"" Apr 21 15:36:16.929005 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:16.928943 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq5ln" Apr 21 15:36:17.075446 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:17.075412 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sq5ln"] Apr 21 15:36:17.078595 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:17.078565 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b870a2e_b786_497a_8ee3_57668a43f22d.slice/crio-c5bd1e4680296409420afd91706ec7c665ffe67716fd22ec27b412003a8f3977 WatchSource:0}: Error finding container c5bd1e4680296409420afd91706ec7c665ffe67716fd22ec27b412003a8f3977: Status 404 returned error can't find the container with id c5bd1e4680296409420afd91706ec7c665ffe67716fd22ec27b412003a8f3977 Apr 21 15:36:17.132580 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:17.132524 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n554q" event={"ID":"94cdcb75-41df-488a-9f65-1dcac041f00e","Type":"ContainerStarted","Data":"d63d97ee65beeb4c879148623a0bebc9595640adc26aeab056c4b0607c32b445"} Apr 21 15:36:17.134008 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:17.133977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" event={"ID":"608e7afe-b62a-4ba0-b260-74afbb27a0f7","Type":"ContainerStarted","Data":"a50fca4e76c548eb08281f84485d9b64db0331507c6f639f5c3e6095a98618bc"} Apr 21 15:36:17.135247 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:17.135178 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sq5ln" event={"ID":"8b870a2e-b786-497a-8ee3-57668a43f22d","Type":"ContainerStarted","Data":"c5bd1e4680296409420afd91706ec7c665ffe67716fd22ec27b412003a8f3977"} Apr 21 15:36:17.137007 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:17.136969 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" event={"ID":"b8784f5d-7f15-4691-ba7d-539cda706701","Type":"ContainerStarted","Data":"923ed5ceedfa218dd9b400feb753e0735b1ccabafc5323910884d5a0251f182c"} Apr 21 15:36:17.140206 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:17.140151 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68568f786f-8pz2q" event={"ID":"309490ab-206f-4ed9-9045-5effcdd68f2a","Type":"ContainerStarted","Data":"9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621"} Apr 21 15:36:17.140206 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:17.140186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68568f786f-8pz2q" event={"ID":"309490ab-206f-4ed9-9045-5effcdd68f2a","Type":"ContainerStarted","Data":"08d89baabbca8b9f98730d833b057807050530b67ff5cc8f8d2e61f71267efc8"} Apr 21 15:36:17.140587 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:17.140554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:36:17.141685 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:17.141615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9wdl5" event={"ID":"e7e9ca3a-1238-49eb-be83-c342ccbacce4","Type":"ContainerStarted","Data":"ccc086c5f294ebe47eb4f19ac7f4fab0d0eec8bf3a0d9f245a5c65d98c3448e5"} Apr 21 15:36:17.165316 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:17.164942 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-68568f786f-8pz2q" podStartSLOduration=65.164927655 podStartE2EDuration="1m5.164927655s" podCreationTimestamp="2026-04-21 15:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:36:17.164103013 
+0000 UTC m=+65.882180363" watchObservedRunningTime="2026-04-21 15:36:17.164927655 +0000 UTC m=+65.883005004" Apr 21 15:36:19.617806 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.617750 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-75lvm"] Apr 21 15:36:19.647345 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.646129 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57"] Apr 21 15:36:19.647345 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.646284 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.649218 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.648989 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 15:36:19.649389 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.649220 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 15:36:19.649389 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.649279 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 15:36:19.650374 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.650339 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-dzkls\"" Apr 21 15:36:19.650512 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.650414 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 15:36:19.654923 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.654897 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 15:36:19.663763 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.663739 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57"] Apr 21 15:36:19.663893 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.663874 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-75lvm"] Apr 21 15:36:19.663958 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.663886 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" Apr 21 15:36:19.668037 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.667978 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 15:36:19.668422 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.668245 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-qg7r4\"" Apr 21 15:36:19.670150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.669150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 15:36:19.670150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.669781 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 21 15:36:19.670150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.669782 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 21 15:36:19.776000 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.775963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-tmp\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.776283 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.776012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtkgg\" (UniqueName: \"kubernetes.io/projected/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-kube-api-access-wtkgg\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.776283 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.776046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f63638af-e18a-4727-b301-3061e1e187b2-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ttq57\" (UID: \"f63638af-e18a-4727-b301-3061e1e187b2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" Apr 21 15:36:19.776283 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.776086 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f63638af-e18a-4727-b301-3061e1e187b2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ttq57\" (UID: \"f63638af-e18a-4727-b301-3061e1e187b2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" Apr 21 15:36:19.776283 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.776115 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-snapshots\") pod \"insights-operator-585dfdc468-75lvm\" (UID: 
\"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.776283 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.776147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mxr6\" (UniqueName: \"kubernetes.io/projected/f63638af-e18a-4727-b301-3061e1e187b2-kube-api-access-6mxr6\") pod \"cluster-monitoring-operator-75587bd455-ttq57\" (UID: \"f63638af-e18a-4727-b301-3061e1e187b2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" Apr 21 15:36:19.776283 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.776172 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-service-ca-bundle\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.776283 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.776249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.776283 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.776264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-serving-cert\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.877018 ip-10-0-136-162 kubenswrapper[2576]: 
I0421 15:36:19.876942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.877018 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.876977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-serving-cert\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.877210 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.877021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-tmp\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.877210 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.877047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkgg\" (UniqueName: \"kubernetes.io/projected/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-kube-api-access-wtkgg\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.877210 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.877091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f63638af-e18a-4727-b301-3061e1e187b2-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ttq57\" (UID: 
\"f63638af-e18a-4727-b301-3061e1e187b2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" Apr 21 15:36:19.877210 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.877138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f63638af-e18a-4727-b301-3061e1e187b2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ttq57\" (UID: \"f63638af-e18a-4727-b301-3061e1e187b2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" Apr 21 15:36:19.877210 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.877171 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-snapshots\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.877558 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.877526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-tmp\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.877775 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.877555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mxr6\" (UniqueName: \"kubernetes.io/projected/f63638af-e18a-4727-b301-3061e1e187b2-kube-api-access-6mxr6\") pod \"cluster-monitoring-operator-75587bd455-ttq57\" (UID: \"f63638af-e18a-4727-b301-3061e1e187b2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" Apr 21 15:36:19.877775 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.877613 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-service-ca-bundle\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.877935 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.877814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-snapshots\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.878072 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.878052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f63638af-e18a-4727-b301-3061e1e187b2-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ttq57\" (UID: \"f63638af-e18a-4727-b301-3061e1e187b2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" Apr 21 15:36:19.878138 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.878096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.878214 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.878197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-service-ca-bundle\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " 
pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.879922 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.879892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-serving-cert\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.879922 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.879895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f63638af-e18a-4727-b301-3061e1e187b2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ttq57\" (UID: \"f63638af-e18a-4727-b301-3061e1e187b2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" Apr 21 15:36:19.889841 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.889809 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtkgg\" (UniqueName: \"kubernetes.io/projected/2f94a9e6-1b72-42ce-8590-58a4cdf199ab-kube-api-access-wtkgg\") pod \"insights-operator-585dfdc468-75lvm\" (UID: \"2f94a9e6-1b72-42ce-8590-58a4cdf199ab\") " pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.889970 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.889810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mxr6\" (UniqueName: \"kubernetes.io/projected/f63638af-e18a-4727-b301-3061e1e187b2-kube-api-access-6mxr6\") pod \"cluster-monitoring-operator-75587bd455-ttq57\" (UID: \"f63638af-e18a-4727-b301-3061e1e187b2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" Apr 21 15:36:19.958722 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.958688 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-75lvm" Apr 21 15:36:19.975510 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:19.975464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" Apr 21 15:36:20.983463 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:20.982425 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57"] Apr 21 15:36:21.032128 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:21.032079 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-75lvm"] Apr 21 15:36:21.037094 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:21.037062 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f94a9e6_1b72_42ce_8590_58a4cdf199ab.slice/crio-84e4dd3ec45df824df630a18587e41747ebdf251b5a15807fd67d89070308302 WatchSource:0}: Error finding container 84e4dd3ec45df824df630a18587e41747ebdf251b5a15807fd67d89070308302: Status 404 returned error can't find the container with id 84e4dd3ec45df824df630a18587e41747ebdf251b5a15807fd67d89070308302 Apr 21 15:36:21.159886 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:21.159843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sq5ln" event={"ID":"8b870a2e-b786-497a-8ee3-57668a43f22d","Type":"ContainerStarted","Data":"7607db6efdcce71823626ab7de3931e6fcc1bd2664e0fd19507e6a3996952943"} Apr 21 15:36:21.161389 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:21.161357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" event={"ID":"b8784f5d-7f15-4691-ba7d-539cda706701","Type":"ContainerStarted","Data":"54eb02bb6aae420bae45c15ce8ca7337e174e393200e88e5e5fa63f231e5e928"} Apr 21 
15:36:21.170094 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:21.170034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9wdl5" event={"ID":"e7e9ca3a-1238-49eb-be83-c342ccbacce4","Type":"ContainerStarted","Data":"77e1ffaf9497e0fbc8e458436b1f514131247f51b1be88130b6c976a6d62d5af"} Apr 21 15:36:21.173143 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:21.173115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-75lvm" event={"ID":"2f94a9e6-1b72-42ce-8590-58a4cdf199ab","Type":"ContainerStarted","Data":"84e4dd3ec45df824df630a18587e41747ebdf251b5a15807fd67d89070308302"} Apr 21 15:36:21.175350 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:21.175323 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n554q" event={"ID":"94cdcb75-41df-488a-9f65-1dcac041f00e","Type":"ContainerStarted","Data":"888bdeff86635c4037ce358a17a1a97320f525703e0846f25bf0045ff33cfcbc"} Apr 21 15:36:21.183566 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:21.183522 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" event={"ID":"608e7afe-b62a-4ba0-b260-74afbb27a0f7","Type":"ContainerStarted","Data":"f57a5d6f36a3d67f618603d1b6f514126fdc7a935ef4f22203d924839f29a204"} Apr 21 15:36:21.188663 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:21.188604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" event={"ID":"f63638af-e18a-4727-b301-3061e1e187b2","Type":"ContainerStarted","Data":"bbce7d2882aed7534fd70d498a14b06f30dc0eac484afb467d8642be357f4bfa"} Apr 21 15:36:21.196418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:21.196363 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ndqn6" 
podStartSLOduration=39.056943758 podStartE2EDuration="43.19634761s" podCreationTimestamp="2026-04-21 15:35:38 +0000 UTC" firstStartedPulling="2026-04-21 15:36:16.657321784 +0000 UTC m=+65.375399115" lastFinishedPulling="2026-04-21 15:36:20.796725629 +0000 UTC m=+69.514802967" observedRunningTime="2026-04-21 15:36:21.194410968 +0000 UTC m=+69.912488331" watchObservedRunningTime="2026-04-21 15:36:21.19634761 +0000 UTC m=+69.914424961" Apr 21 15:36:21.215723 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:21.215659 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-n554q" podStartSLOduration=33.11616129 podStartE2EDuration="37.215638878s" podCreationTimestamp="2026-04-21 15:35:44 +0000 UTC" firstStartedPulling="2026-04-21 15:36:16.697952227 +0000 UTC m=+65.416029554" lastFinishedPulling="2026-04-21 15:36:20.79742981 +0000 UTC m=+69.515507142" observedRunningTime="2026-04-21 15:36:21.215322228 +0000 UTC m=+69.933399577" watchObservedRunningTime="2026-04-21 15:36:21.215638878 +0000 UTC m=+69.933716227" Apr 21 15:36:22.195724 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:22.195161 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" event={"ID":"608e7afe-b62a-4ba0-b260-74afbb27a0f7","Type":"ContainerStarted","Data":"4298cb64aab288d55856283dbd3a7c8be83e3e3f06e39821fd884fe38fa51b12"} Apr 21 15:36:22.198168 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:22.198118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sq5ln" event={"ID":"8b870a2e-b786-497a-8ee3-57668a43f22d","Type":"ContainerStarted","Data":"90e92e732de69c449fa922df57fa95c3eeecc530e82dc6ae9ac822a178f1b6a4"} Apr 21 15:36:22.200133 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:22.200087 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9wdl5" 
event={"ID":"e7e9ca3a-1238-49eb-be83-c342ccbacce4","Type":"ContainerStarted","Data":"3a43058bb18e3c392795948620c3a5dced441d1a1f2b9f9a47ff40ff86eb4bcc"} Apr 21 15:36:22.214548 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:22.214470 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5lpt" podStartSLOduration=58.070562152 podStartE2EDuration="1m2.214450617s" podCreationTimestamp="2026-04-21 15:35:20 +0000 UTC" firstStartedPulling="2026-04-21 15:36:16.654743744 +0000 UTC m=+65.372821069" lastFinishedPulling="2026-04-21 15:36:20.79863219 +0000 UTC m=+69.516709534" observedRunningTime="2026-04-21 15:36:22.214119837 +0000 UTC m=+70.932197185" watchObservedRunningTime="2026-04-21 15:36:22.214450617 +0000 UTC m=+70.932527965" Apr 21 15:36:22.232006 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:22.231950 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sq5ln" podStartSLOduration=66.515971918 podStartE2EDuration="1m10.231933433s" podCreationTimestamp="2026-04-21 15:35:12 +0000 UTC" firstStartedPulling="2026-04-21 15:36:17.081018064 +0000 UTC m=+65.799095391" lastFinishedPulling="2026-04-21 15:36:20.796979565 +0000 UTC m=+69.515056906" observedRunningTime="2026-04-21 15:36:22.230775549 +0000 UTC m=+70.948852898" watchObservedRunningTime="2026-04-21 15:36:22.231933433 +0000 UTC m=+70.950010781" Apr 21 15:36:22.252682 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:22.252619 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9wdl5" podStartSLOduration=34.133538804 podStartE2EDuration="38.252600181s" podCreationTimestamp="2026-04-21 15:35:44 +0000 UTC" firstStartedPulling="2026-04-21 15:36:16.677620122 +0000 UTC m=+65.395697452" lastFinishedPulling="2026-04-21 15:36:20.796681492 +0000 UTC m=+69.514758829" observedRunningTime="2026-04-21 15:36:22.252409094 
+0000 UTC m=+70.970486442" watchObservedRunningTime="2026-04-21 15:36:22.252600181 +0000 UTC m=+70.970677530" Apr 21 15:36:23.203875 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:23.203836 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9wdl5" Apr 21 15:36:23.907791 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:23.907756 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7578f8dd75-wkzk9"] Apr 21 15:36:23.937126 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:23.937092 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7578f8dd75-wkzk9"] Apr 21 15:36:23.937370 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:23.937197 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:23.940681 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:23.940660 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 15:36:23.940802 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:23.940664 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 15:36:23.940802 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:23.940664 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 15:36:23.941093 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:23.941074 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-46h4z\"" Apr 21 15:36:23.941947 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:23.941743 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 15:36:23.942080 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:36:23.942062 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 15:36:23.942228 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:23.942211 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 15:36:23.942534 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:23.942517 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 15:36:24.019429 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.019329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-oauth-config\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.019429 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.019385 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-console-config\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.019429 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.019410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4xk4\" (UniqueName: \"kubernetes.io/projected/be214c51-9db0-4029-aed5-20b000e89c05-kube-api-access-x4xk4\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.019684 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.019480 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-oauth-serving-cert\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.019684 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.019583 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-service-ca\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.019684 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.019635 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-serving-cert\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.070941 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.070910 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-t694t" Apr 21 15:36:24.120972 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.120933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-serving-cert\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.121299 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.121273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-oauth-config\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.121464 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.121346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-console-config\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.121464 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.121374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4xk4\" (UniqueName: \"kubernetes.io/projected/be214c51-9db0-4029-aed5-20b000e89c05-kube-api-access-x4xk4\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.121464 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.121414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-oauth-serving-cert\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.121661 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.121475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-service-ca\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.122404 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.122075 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-console-config\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.122404 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.122176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-service-ca\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.122773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.122746 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-oauth-serving-cert\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.123844 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.123800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-serving-cert\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.124140 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.124099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-oauth-config\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.134504 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:36:24.134463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4xk4\" (UniqueName: \"kubernetes.io/projected/be214c51-9db0-4029-aed5-20b000e89c05-kube-api-access-x4xk4\") pod \"console-7578f8dd75-wkzk9\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.207878 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.207835 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-75lvm" event={"ID":"2f94a9e6-1b72-42ce-8590-58a4cdf199ab","Type":"ContainerStarted","Data":"2945db969d3efd4deea40dc2b949c03ab2a42461129960e64de6d1e03f28dcfb"} Apr 21 15:36:24.209287 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.209248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" event={"ID":"f63638af-e18a-4727-b301-3061e1e187b2","Type":"ContainerStarted","Data":"2aa9b02a24e4a36bd0f92fe5cc818ed91bbde5446ff8024cd8131be6a604b4d4"} Apr 21 15:36:24.228311 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.228249 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-75lvm" podStartSLOduration=2.499763934 podStartE2EDuration="5.228230806s" podCreationTimestamp="2026-04-21 15:36:19 +0000 UTC" firstStartedPulling="2026-04-21 15:36:21.039359287 +0000 UTC m=+69.757436613" lastFinishedPulling="2026-04-21 15:36:23.767826159 +0000 UTC m=+72.485903485" observedRunningTime="2026-04-21 15:36:24.226797819 +0000 UTC m=+72.944875168" watchObservedRunningTime="2026-04-21 15:36:24.228230806 +0000 UTC m=+72.946308155" Apr 21 15:36:24.247778 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.247744 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:36:24.253685 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.253632 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ttq57" podStartSLOduration=2.493787029 podStartE2EDuration="5.253615383s" podCreationTimestamp="2026-04-21 15:36:19 +0000 UTC" firstStartedPulling="2026-04-21 15:36:21.002837653 +0000 UTC m=+69.720914994" lastFinishedPulling="2026-04-21 15:36:23.762666022 +0000 UTC m=+72.480743348" observedRunningTime="2026-04-21 15:36:24.251452601 +0000 UTC m=+72.969529949" watchObservedRunningTime="2026-04-21 15:36:24.253615383 +0000 UTC m=+72.971692733" Apr 21 15:36:24.392927 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.392893 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7578f8dd75-wkzk9"] Apr 21 15:36:24.396211 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:24.396181 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe214c51_9db0_4029_aed5_20b000e89c05.slice/crio-62c2cce603f4b3d61317a9bef5ef426a0442a0fa9050429713e08761d5b011c9 WatchSource:0}: Error finding container 62c2cce603f4b3d61317a9bef5ef426a0442a0fa9050429713e08761d5b011c9: Status 404 returned error can't find the container with id 62c2cce603f4b3d61317a9bef5ef426a0442a0fa9050429713e08761d5b011c9 Apr 21 15:36:24.399726 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.399702 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk"] Apr 21 15:36:24.404274 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.404254 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk" Apr 21 15:36:24.406880 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.406855 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 15:36:24.406993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.406919 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-zvmnw\"" Apr 21 15:36:24.415073 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.415052 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk"] Apr 21 15:36:24.525385 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.525346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a1c5802a-6473-41b2-bc16-71d347b210f5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9ntvk\" (UID: \"a1c5802a-6473-41b2-bc16-71d347b210f5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk" Apr 21 15:36:24.626842 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.626740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a1c5802a-6473-41b2-bc16-71d347b210f5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9ntvk\" (UID: \"a1c5802a-6473-41b2-bc16-71d347b210f5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk" Apr 21 15:36:24.630148 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.630110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/a1c5802a-6473-41b2-bc16-71d347b210f5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9ntvk\" (UID: \"a1c5802a-6473-41b2-bc16-71d347b210f5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk" Apr 21 15:36:24.713931 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.713880 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk" Apr 21 15:36:24.848210 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:24.847801 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk"] Apr 21 15:36:24.850703 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:24.850675 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1c5802a_6473_41b2_bc16_71d347b210f5.slice/crio-e91efd5bcd6d860f7fb3bee0e49d3288c47c536218d8771d97b9f29386b57be7 WatchSource:0}: Error finding container e91efd5bcd6d860f7fb3bee0e49d3288c47c536218d8771d97b9f29386b57be7: Status 404 returned error can't find the container with id e91efd5bcd6d860f7fb3bee0e49d3288c47c536218d8771d97b9f29386b57be7 Apr 21 15:36:25.213767 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:25.213721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7578f8dd75-wkzk9" event={"ID":"be214c51-9db0-4029-aed5-20b000e89c05","Type":"ContainerStarted","Data":"62c2cce603f4b3d61317a9bef5ef426a0442a0fa9050429713e08761d5b011c9"} Apr 21 15:36:25.214773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:25.214744 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk" 
event={"ID":"a1c5802a-6473-41b2-bc16-71d347b210f5","Type":"ContainerStarted","Data":"e91efd5bcd6d860f7fb3bee0e49d3288c47c536218d8771d97b9f29386b57be7"} Apr 21 15:36:25.548780 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:25.548673 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9wdl5_e7e9ca3a-1238-49eb-be83-c342ccbacce4/dns/0.log" Apr 21 15:36:25.735433 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:25.735404 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9wdl5_e7e9ca3a-1238-49eb-be83-c342ccbacce4/kube-rbac-proxy/0.log" Apr 21 15:36:26.219943 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.219855 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk" event={"ID":"a1c5802a-6473-41b2-bc16-71d347b210f5","Type":"ContainerStarted","Data":"d8a0ab7b21f88e73804f9b4d966b02fe312171efe7ad3c7e2b8174f2e5bb9cb9"} Apr 21 15:36:26.220397 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.220075 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk" Apr 21 15:36:26.226617 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.226575 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk" Apr 21 15:36:26.241606 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.241553 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9ntvk" podStartSLOduration=1.160975863 podStartE2EDuration="2.241535787s" podCreationTimestamp="2026-04-21 15:36:24 +0000 UTC" firstStartedPulling="2026-04-21 15:36:24.852889941 +0000 UTC m=+73.570967267" lastFinishedPulling="2026-04-21 15:36:25.933449851 +0000 UTC m=+74.651527191" 
observedRunningTime="2026-04-21 15:36:26.240604301 +0000 UTC m=+74.958681650" watchObservedRunningTime="2026-04-21 15:36:26.241535787 +0000 UTC m=+74.959613136" Apr 21 15:36:26.479050 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.478961 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zzs27"] Apr 21 15:36:26.483803 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.483777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:26.488776 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.488744 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 15:36:26.488776 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.488752 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 15:36:26.489005 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.488846 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 15:36:26.490117 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.489889 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-d88xj\"" Apr 21 15:36:26.491863 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.491838 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zzs27"] Apr 21 15:36:26.642627 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.642582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:26.642818 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.642647 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:26.642818 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.642754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:26.642818 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.642809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6s65\" (UniqueName: \"kubernetes.io/projected/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-kube-api-access-h6s65\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:26.743752 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.743640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: 
\"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:26.743752 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.743708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:26.743752 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.743749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6s65\" (UniqueName: \"kubernetes.io/projected/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-kube-api-access-h6s65\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:26.744038 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.743824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:26.744038 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:26.743855 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 21 15:36:26.744038 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:26.743934 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-prometheus-operator-tls podName:7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:36:27.243913108 +0000 UTC m=+75.961990434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-zzs27" (UID: "7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9") : secret "prometheus-operator-tls" not found Apr 21 15:36:26.744585 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.744530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:26.745727 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.745707 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wl2ck_e7197e20-bb57-4167-b435-1446351d6727/dns-node-resolver/0.log" Apr 21 15:36:26.746691 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.746668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:26.761860 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:26.761832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6s65\" (UniqueName: \"kubernetes.io/projected/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-kube-api-access-h6s65\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:27.132212 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.132176 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-68568f786f-8pz2q_309490ab-206f-4ed9-9045-5effcdd68f2a/registry/0.log" Apr 21 15:36:27.249367 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.249340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:27.251732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.251707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zzs27\" (UID: \"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:27.395724 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.395633 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" Apr 21 15:36:27.518166 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.518135 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zzs27"] Apr 21 15:36:27.520758 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:27.520728 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a8e0d7d_a3b6_4a7d_91d5_b2d10859b1f9.slice/crio-67c26c6090473cceda689e82b3a8a219bb1d50df50dabd946c52e2466c559a5e WatchSource:0}: Error finding container 67c26c6090473cceda689e82b3a8a219bb1d50df50dabd946c52e2466c559a5e: Status 404 returned error can't find the container with id 67c26c6090473cceda689e82b3a8a219bb1d50df50dabd946c52e2466c559a5e Apr 21 15:36:27.648339 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.648259 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c955d4848-qs29m"] Apr 21 15:36:27.652828 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.652802 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.662685 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.661057 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c955d4848-qs29m"] Apr 21 15:36:27.665083 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.665008 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 15:36:27.731449 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.731422 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-b59t6_785c95fa-5c55-4ea7-8b31-adcc8f22c2e2/node-ca/0.log" Apr 21 15:36:27.753869 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.753823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-oauth-serving-cert\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.754053 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.753881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-console-config\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.754053 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.753913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwx5w\" (UniqueName: \"kubernetes.io/projected/0ab4e661-8ff7-460f-ab72-609d40571aad-kube-api-access-hwx5w\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " 
pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.754053 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.753945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-oauth-config\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.754053 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.753962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-service-ca\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.754219 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.754084 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-trusted-ca-bundle\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.754219 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.754125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-serving-cert\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.854634 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.854590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-oauth-serving-cert\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.854634 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.854639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-console-config\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.854885 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.854660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwx5w\" (UniqueName: \"kubernetes.io/projected/0ab4e661-8ff7-460f-ab72-609d40571aad-kube-api-access-hwx5w\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.854885 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.854688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-oauth-config\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.854885 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.854703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-service-ca\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.854885 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.854734 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-trusted-ca-bundle\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.855086 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.854956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-serving-cert\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.855483 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.855455 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-console-config\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.855601 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.855504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-service-ca\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.855601 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.855576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-oauth-serving-cert\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.855723 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.855660 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-trusted-ca-bundle\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.857215 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.857197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-oauth-config\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.857294 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.857283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-serving-cert\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.864818 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.864798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwx5w\" (UniqueName: \"kubernetes.io/projected/0ab4e661-8ff7-460f-ab72-609d40571aad-kube-api-access-hwx5w\") pod \"console-6c955d4848-qs29m\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:27.966870 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:27.966781 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:36:28.101601 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:28.101562 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c955d4848-qs29m"] Apr 21 15:36:28.105004 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:28.104962 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ab4e661_8ff7_460f_ab72_609d40571aad.slice/crio-573a3aa90714e7464be483ecfcd615a1a8c65ce863f0c96ec597c11928575901 WatchSource:0}: Error finding container 573a3aa90714e7464be483ecfcd615a1a8c65ce863f0c96ec597c11928575901: Status 404 returned error can't find the container with id 573a3aa90714e7464be483ecfcd615a1a8c65ce863f0c96ec597c11928575901 Apr 21 15:36:28.228700 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:28.228133 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7578f8dd75-wkzk9" event={"ID":"be214c51-9db0-4029-aed5-20b000e89c05","Type":"ContainerStarted","Data":"24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20"} Apr 21 15:36:28.230732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:28.230696 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c955d4848-qs29m" event={"ID":"0ab4e661-8ff7-460f-ab72-609d40571aad","Type":"ContainerStarted","Data":"33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f"} Apr 21 15:36:28.230870 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:28.230739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c955d4848-qs29m" event={"ID":"0ab4e661-8ff7-460f-ab72-609d40571aad","Type":"ContainerStarted","Data":"573a3aa90714e7464be483ecfcd615a1a8c65ce863f0c96ec597c11928575901"} Apr 21 15:36:28.232213 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:28.232186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" event={"ID":"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9","Type":"ContainerStarted","Data":"67c26c6090473cceda689e82b3a8a219bb1d50df50dabd946c52e2466c559a5e"} Apr 21 15:36:28.253028 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:28.252968 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7578f8dd75-wkzk9" podStartSLOduration=2.429148829 podStartE2EDuration="5.252946439s" podCreationTimestamp="2026-04-21 15:36:23 +0000 UTC" firstStartedPulling="2026-04-21 15:36:24.398779953 +0000 UTC m=+73.116857286" lastFinishedPulling="2026-04-21 15:36:27.222577567 +0000 UTC m=+75.940654896" observedRunningTime="2026-04-21 15:36:28.25140114 +0000 UTC m=+76.969478488" watchObservedRunningTime="2026-04-21 15:36:28.252946439 +0000 UTC m=+76.971023788" Apr 21 15:36:28.274543 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:28.274463 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c955d4848-qs29m" podStartSLOduration=1.274441144 podStartE2EDuration="1.274441144s" podCreationTimestamp="2026-04-21 15:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:36:28.272198665 +0000 UTC m=+76.990276038" watchObservedRunningTime="2026-04-21 15:36:28.274441144 +0000 UTC m=+76.992518493" Apr 21 15:36:28.531688 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:28.531594 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-n554q_94cdcb75-41df-488a-9f65-1dcac041f00e/serve-healthcheck-canary/0.log" Apr 21 15:36:29.237714 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:29.237668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" 
event={"ID":"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9","Type":"ContainerStarted","Data":"af78ddade81dd1410412d83c6f79c61ca0377140c30676eea91e1e7505edff1f"} Apr 21 15:36:29.237714 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:29.237717 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" event={"ID":"7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9","Type":"ContainerStarted","Data":"8b1feea13b8837ae290637a35b2ce3f83dc9d69f6024d49f89d0b0e543080f65"} Apr 21 15:36:29.257288 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:29.257232 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-zzs27" podStartSLOduration=2.010911145 podStartE2EDuration="3.257214866s" podCreationTimestamp="2026-04-21 15:36:26 +0000 UTC" firstStartedPulling="2026-04-21 15:36:27.522908374 +0000 UTC m=+76.240985704" lastFinishedPulling="2026-04-21 15:36:28.769212087 +0000 UTC m=+77.487289425" observedRunningTime="2026-04-21 15:36:29.255685932 +0000 UTC m=+77.973763279" watchObservedRunningTime="2026-04-21 15:36:29.257214866 +0000 UTC m=+77.975292214" Apr 21 15:36:30.954660 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:30.954616 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-blplp"] Apr 21 15:36:30.979019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:30.978991 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tzxlg"] Apr 21 15:36:30.979186 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:30.979161 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:30.981996 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:30.981970 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-p2gdk\"" Apr 21 15:36:30.981996 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:30.981984 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 15:36:30.983172 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:30.983154 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 15:36:30.983355 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:30.983340 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 15:36:30.994857 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:30.994835 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-blplp"] Apr 21 15:36:30.994986 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:30.994943 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:30.998239 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:30.998218 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 15:36:30.998728 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:30.998678 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 15:36:31.014747 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.014726 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mk69c\"" Apr 21 15:36:31.014999 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.014986 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 15:36:31.085706 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.085672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-wtmp\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.085854 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.085732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.085854 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.085782 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjc5j\" (UniqueName: \"kubernetes.io/projected/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-kube-api-access-qjc5j\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.085854 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.085813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-sys\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.085854 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.085836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.085993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.085866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.085993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.085901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c884d\" (UniqueName: \"kubernetes.io/projected/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-api-access-c884d\") pod 
\"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.085993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.085936 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-accelerators-collector-config\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.086081 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.086005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-tls\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.086081 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.086054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.086140 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.086082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-root\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.086140 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.086100 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.086140 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.086121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-textfile\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.086234 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.086164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.086234 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.086192 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-metrics-client-ca\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.186667 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.186628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-metrics-client-ca\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.186667 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.186672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-wtmp\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.186919 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.186705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.186919 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.186720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjc5j\" (UniqueName: \"kubernetes.io/projected/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-kube-api-access-qjc5j\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.186919 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.186836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-sys\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.186919 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.186886 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.186919 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.186907 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-wtmp\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.187153 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.186928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-sys\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.187153 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.186932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.187153 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.186998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c884d\" (UniqueName: \"kubernetes.io/projected/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-api-access-c884d\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.187153 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:31.187025 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 21 15:36:31.187153 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-accelerators-collector-config\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.187153 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:31.187096 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-tls podName:e7b09aa4-22f8-4fa3-b7cb-08b261441aff nodeName:}" failed. No retries permitted until 2026-04-21 15:36:31.687074908 +0000 UTC m=+80.405152251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-blplp" (UID: "e7b09aa4-22f8-4fa3-b7cb-08b261441aff") : secret "kube-state-metrics-tls" not found Apr 21 15:36:31.187439 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-tls\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.187439 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.187439 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-root\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.187439 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:31.187287 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 15:36:31.187439 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-metrics-client-ca\") 
pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.187439 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.187439 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:31.187338 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-tls podName:d9c8569a-9c0f-45ef-b5b3-6d6893c96750 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:31.687320204 +0000 UTC m=+80.405397549 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-tls") pod "node-exporter-tzxlg" (UID: "d9c8569a-9c0f-45ef-b5b3-6d6893c96750") : secret "node-exporter-tls" not found Apr 21 15:36:31.187439 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-textfile\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.187439 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.187439 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-root\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.187866 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.187866 
ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.187866 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-textfile\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.187866 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-accelerators-collector-config\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.187983 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.187948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.189701 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.189683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.190380 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.189766 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.201911 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.201872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c884d\" (UniqueName: \"kubernetes.io/projected/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-api-access-c884d\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:31.202599 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.202574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjc5j\" (UniqueName: \"kubernetes.io/projected/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-kube-api-access-qjc5j\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.690901 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.690868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 
15:36:31.691081 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.690907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-tls\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.691081 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:31.691030 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 21 15:36:31.691206 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:31.691102 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-tls podName:e7b09aa4-22f8-4fa3-b7cb-08b261441aff nodeName:}" failed. No retries permitted until 2026-04-21 15:36:32.69108397 +0000 UTC m=+81.409161296 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-blplp" (UID: "e7b09aa4-22f8-4fa3-b7cb-08b261441aff") : secret "kube-state-metrics-tls" not found Apr 21 15:36:31.693309 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.693281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d9c8569a-9c0f-45ef-b5b3-6d6893c96750-node-exporter-tls\") pod \"node-exporter-tzxlg\" (UID: \"d9c8569a-9c0f-45ef-b5b3-6d6893c96750\") " pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.826462 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.826424 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5z45k"] Apr 21 15:36:31.832404 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.832379 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:31.836643 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.836622 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 15:36:31.838021 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.837615 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 15:36:31.838826 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.838631 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dcdll\"" Apr 21 15:36:31.839894 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.839870 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5z45k"] Apr 21 15:36:31.903628 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.903592 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tzxlg" Apr 21 15:36:31.916395 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:31.916361 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c8569a_9c0f_45ef_b5b3_6d6893c96750.slice/crio-2ac55796feda9d056c21c070476de5fbb0470572491f9b0533fad61a3ce6eab2 WatchSource:0}: Error finding container 2ac55796feda9d056c21c070476de5fbb0470572491f9b0533fad61a3ce6eab2: Status 404 returned error can't find the container with id 2ac55796feda9d056c21c070476de5fbb0470572491f9b0533fad61a3ce6eab2 Apr 21 15:36:31.993604 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.993513 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-data-volume\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:31.993604 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.993562 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:31.993993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.993620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-crio-socket\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:31.993993 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:36:31.993655 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:31.993993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:31.993711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56vk4\" (UniqueName: \"kubernetes.io/projected/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-kube-api-access-56vk4\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.094983 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.094945 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-data-volume\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.094983 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.094982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.095222 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.095006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-crio-socket\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.095222 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.095037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.095222 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.095091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56vk4\" (UniqueName: \"kubernetes.io/projected/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-kube-api-access-56vk4\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.095222 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.095109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-crio-socket\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.095389 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:32.095216 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 15:36:32.095389 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:36:32.095325 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-insights-runtime-extractor-tls 
podName:19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed nodeName:}" failed. No retries permitted until 2026-04-21 15:36:32.595301043 +0000 UTC m=+81.313378411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5z45k" (UID: "19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed") : secret "insights-runtime-extractor-tls" not found Apr 21 15:36:32.095482 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.095454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-data-volume\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.095650 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.095634 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.110104 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.110078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56vk4\" (UniqueName: \"kubernetes.io/projected/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-kube-api-access-56vk4\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.247410 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.247326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tzxlg" 
event={"ID":"d9c8569a-9c0f-45ef-b5b3-6d6893c96750","Type":"ContainerStarted","Data":"2ac55796feda9d056c21c070476de5fbb0470572491f9b0533fad61a3ce6eab2"} Apr 21 15:36:32.601157 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.601113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.603595 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.603562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5z45k\" (UID: \"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed\") " pod="openshift-insights/insights-runtime-extractor-5z45k" Apr 21 15:36:32.704436 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.704388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:32.706909 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.706882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7b09aa4-22f8-4fa3-b7cb-08b261441aff-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-blplp\" (UID: \"e7b09aa4-22f8-4fa3-b7cb-08b261441aff\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" Apr 21 15:36:32.743865 ip-10-0-136-162 kubenswrapper[2576]: 
I0421 15:36:32.743831 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5z45k"
Apr 21 15:36:32.789702 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.789655 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp"
Apr 21 15:36:32.894899 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.894848 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5z45k"]
Apr 21 15:36:32.910698 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:32.910648 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19bbbe8b_bd00_4795_ab06_e9b9e3cb2eed.slice/crio-7741664cb488cd1e552d00509e80859ccb7ed23b779a7e3f5a00d17fe3476645 WatchSource:0}: Error finding container 7741664cb488cd1e552d00509e80859ccb7ed23b779a7e3f5a00d17fe3476645: Status 404 returned error can't find the container with id 7741664cb488cd1e552d00509e80859ccb7ed23b779a7e3f5a00d17fe3476645
Apr 21 15:36:32.940483 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.940458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-blplp"]
Apr 21 15:36:32.942739 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:32.942705 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b09aa4_22f8_4fa3_b7cb_08b261441aff.slice/crio-e933d85da092c3adaf8faa8ee77d531948d13d785800ffad01c3eb9357ca9869 WatchSource:0}: Error finding container e933d85da092c3adaf8faa8ee77d531948d13d785800ffad01c3eb9357ca9869: Status 404 returned error can't find the container with id e933d85da092c3adaf8faa8ee77d531948d13d785800ffad01c3eb9357ca9869
Apr 21 15:36:32.992621 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.992585 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6c7748f595-45kkr"]
Apr 21 15:36:32.996565 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:32.996536 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.000546 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.000520 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 21 15:36:33.000751 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.000729 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 21 15:36:33.000870 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.000558 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 21 15:36:33.000982 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.000964 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 21 15:36:33.001045 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.000597 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-xpqvp\""
Apr 21 15:36:33.001102 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.000575 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 21 15:36:33.001198 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.001175 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-df6iaok9ekoan\""
Apr 21 15:36:33.016929 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.016896 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6c7748f595-45kkr"]
Apr 21 15:36:33.109775 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.109679 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.109775 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.109721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-tls\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.109775 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.109739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.110018 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.109786 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/81d61fd2-e12a-4415-9e98-d8d556cb75f4-metrics-client-ca\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.110018 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.109826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.110018 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.109876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.110018 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.109896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-grpc-tls\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.110018 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.110000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnp5l\" (UniqueName: \"kubernetes.io/projected/81d61fd2-e12a-4415-9e98-d8d556cb75f4-kube-api-access-vnp5l\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.211114 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.211083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnp5l\" (UniqueName: \"kubernetes.io/projected/81d61fd2-e12a-4415-9e98-d8d556cb75f4-kube-api-access-vnp5l\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.211331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.211132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.211331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.211153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-tls\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.211331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.211172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.211331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.211195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/81d61fd2-e12a-4415-9e98-d8d556cb75f4-metrics-client-ca\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.211331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.211223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.211331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.211261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.211331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.211295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-grpc-tls\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.212366 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.211910 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9wdl5"
Apr 21 15:36:33.212366 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.212322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/81d61fd2-e12a-4415-9e98-d8d556cb75f4-metrics-client-ca\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.214383 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.214354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.214538 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.214411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.214538 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.214531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.214678 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.214628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-tls\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.214873 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.214852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.215293 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.215273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/81d61fd2-e12a-4415-9e98-d8d556cb75f4-secret-grpc-tls\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.223040 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.223014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnp5l\" (UniqueName: \"kubernetes.io/projected/81d61fd2-e12a-4415-9e98-d8d556cb75f4-kube-api-access-vnp5l\") pod \"thanos-querier-6c7748f595-45kkr\" (UID: \"81d61fd2-e12a-4415-9e98-d8d556cb75f4\") " pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.252954 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.252916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" event={"ID":"e7b09aa4-22f8-4fa3-b7cb-08b261441aff","Type":"ContainerStarted","Data":"e933d85da092c3adaf8faa8ee77d531948d13d785800ffad01c3eb9357ca9869"}
Apr 21 15:36:33.254699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.254672 2576 generic.go:358] "Generic (PLEG): container finished" podID="d9c8569a-9c0f-45ef-b5b3-6d6893c96750" containerID="5e1cc21e0ad118d96a66b0dfb6b79fe321538aa4fa5530c2f134e47af0c8eeda" exitCode=0
Apr 21 15:36:33.254842 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.254753 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tzxlg" event={"ID":"d9c8569a-9c0f-45ef-b5b3-6d6893c96750","Type":"ContainerDied","Data":"5e1cc21e0ad118d96a66b0dfb6b79fe321538aa4fa5530c2f134e47af0c8eeda"}
Apr 21 15:36:33.256309 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.256289 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5z45k" event={"ID":"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed","Type":"ContainerStarted","Data":"82f3a701095147923000dbcf4d8a3b351cd11feb00b151204487fa28b5bfa4bc"}
Apr 21 15:36:33.256431 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.256313 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5z45k" event={"ID":"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed","Type":"ContainerStarted","Data":"7741664cb488cd1e552d00509e80859ccb7ed23b779a7e3f5a00d17fe3476645"}
Apr 21 15:36:33.309354 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.309324 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:33.480935 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:33.480903 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6c7748f595-45kkr"]
Apr 21 15:36:33.492291 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:33.492247 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81d61fd2_e12a_4415_9e98_d8d556cb75f4.slice/crio-11813b3953eb667e01ed9f31ff64676abc35f1860066312255aa8ee30862a925 WatchSource:0}: Error finding container 11813b3953eb667e01ed9f31ff64676abc35f1860066312255aa8ee30862a925: Status 404 returned error can't find the container with id 11813b3953eb667e01ed9f31ff64676abc35f1860066312255aa8ee30862a925
Apr 21 15:36:34.250460 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:34.250391 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7578f8dd75-wkzk9"
Apr 21 15:36:34.250939 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:34.250470 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7578f8dd75-wkzk9"
Apr 21 15:36:34.257743 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:34.257440 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7578f8dd75-wkzk9"
Apr 21 15:36:34.269378 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:34.269310 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" event={"ID":"e7b09aa4-22f8-4fa3-b7cb-08b261441aff","Type":"ContainerStarted","Data":"30d39e09c3a64eae0e0beeddc0f4c6c2cc2b37a1ca25afcbf920b613ce58f2b3"}
Apr 21 15:36:34.271879 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:34.271808 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr" event={"ID":"81d61fd2-e12a-4415-9e98-d8d556cb75f4","Type":"ContainerStarted","Data":"11813b3953eb667e01ed9f31ff64676abc35f1860066312255aa8ee30862a925"}
Apr 21 15:36:34.276962 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:34.276934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tzxlg" event={"ID":"d9c8569a-9c0f-45ef-b5b3-6d6893c96750","Type":"ContainerStarted","Data":"312ed76529e97a2c8cb8886ebe09be7fc08e2d400204d65b8a77bec020bd3851"}
Apr 21 15:36:34.277327 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:34.277306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tzxlg" event={"ID":"d9c8569a-9c0f-45ef-b5b3-6d6893c96750","Type":"ContainerStarted","Data":"3d92cfac68b488a2ad049d0045257248317bd26265859bb6f8e4ae9ffe5a228d"}
Apr 21 15:36:34.284921 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:34.284884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5z45k" event={"ID":"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed","Type":"ContainerStarted","Data":"98fdf113e77205937daa885c926a5b32c5edbefa33dca7f9bc125a5ec8bc0d9e"}
Apr 21 15:36:34.291757 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:34.291703 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7578f8dd75-wkzk9"
Apr 21 15:36:34.386930 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:34.386723 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tzxlg" podStartSLOduration=3.646279027 podStartE2EDuration="4.386700074s" podCreationTimestamp="2026-04-21 15:36:30 +0000 UTC" firstStartedPulling="2026-04-21 15:36:31.918449895 +0000 UTC m=+80.636527221" lastFinishedPulling="2026-04-21 15:36:32.658870942 +0000 UTC m=+81.376948268" observedRunningTime="2026-04-21 15:36:34.381989167 +0000 UTC m=+83.100066539" watchObservedRunningTime="2026-04-21 15:36:34.386700074 +0000 UTC m=+83.104777423"
Apr 21 15:36:35.290160 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:35.290018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" event={"ID":"e7b09aa4-22f8-4fa3-b7cb-08b261441aff","Type":"ContainerStarted","Data":"f543610b89fb8bbc5029997c1aa8f529f621f93fa94eab769de8ef17fcbbb555"}
Apr 21 15:36:35.290160 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:35.290068 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" event={"ID":"e7b09aa4-22f8-4fa3-b7cb-08b261441aff","Type":"ContainerStarted","Data":"cd890097eede49a39d33fcb580e189b05d1d9cae19d6de8d801bd2475520cc54"}
Apr 21 15:36:35.314414 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:35.314339 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-blplp" podStartSLOduration=4.153345449 podStartE2EDuration="5.314316066s" podCreationTimestamp="2026-04-21 15:36:30 +0000 UTC" firstStartedPulling="2026-04-21 15:36:32.944715001 +0000 UTC m=+81.662792331" lastFinishedPulling="2026-04-21 15:36:34.105685622 +0000 UTC m=+82.823762948" observedRunningTime="2026-04-21 15:36:35.313063541 +0000 UTC m=+84.031140890" watchObservedRunningTime="2026-04-21 15:36:35.314316066 +0000 UTC m=+84.032393415"
Apr 21 15:36:35.754014 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:35.753977 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c955d4848-qs29m"]
Apr 21 15:36:36.295118 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:36.295009 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5z45k" event={"ID":"19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed","Type":"ContainerStarted","Data":"bfe20c92f2261b1f8a61a28eddfffe8bff0f97fdfe74abe59b004c1822998c17"}
Apr 21 15:36:36.301764 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:36.301728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr" event={"ID":"81d61fd2-e12a-4415-9e98-d8d556cb75f4","Type":"ContainerStarted","Data":"f521f997af99879078f30aad735d60cfbf5823320b6303c8ae28de18a331896d"}
Apr 21 15:36:36.301915 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:36.301770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr" event={"ID":"81d61fd2-e12a-4415-9e98-d8d556cb75f4","Type":"ContainerStarted","Data":"775234c567f1013178ded106ebe27183b7f1f37be23d4606663faefc316bdc91"}
Apr 21 15:36:36.301915 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:36.301784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr" event={"ID":"81d61fd2-e12a-4415-9e98-d8d556cb75f4","Type":"ContainerStarted","Data":"0b554a3d7c4c5cd3e30b25e35cfe1fb1a032e5cdcdfba9847feec2243b84480c"}
Apr 21 15:36:36.324144 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:36.324078 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5z45k" podStartSLOduration=2.407150663 podStartE2EDuration="5.324054024s" podCreationTimestamp="2026-04-21 15:36:31 +0000 UTC" firstStartedPulling="2026-04-21 15:36:32.981689425 +0000 UTC m=+81.699766752" lastFinishedPulling="2026-04-21 15:36:35.898592769 +0000 UTC m=+84.616670113" observedRunningTime="2026-04-21 15:36:36.321936552 +0000 UTC m=+85.040013900" watchObservedRunningTime="2026-04-21 15:36:36.324054024 +0000 UTC m=+85.042131467"
Apr 21 15:36:37.309918 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:37.309871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr" event={"ID":"81d61fd2-e12a-4415-9e98-d8d556cb75f4","Type":"ContainerStarted","Data":"800f6b69eb75eb0cb0b656d35da35192d29fd8ca1bab3301465af54f21f325dd"}
Apr 21 15:36:37.310465 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:37.309924 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr" event={"ID":"81d61fd2-e12a-4415-9e98-d8d556cb75f4","Type":"ContainerStarted","Data":"b506fb01608c8244d6d080b1c6008c65d88746a367928004604c767a043371bf"}
Apr 21 15:36:37.310465 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:37.309941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr" event={"ID":"81d61fd2-e12a-4415-9e98-d8d556cb75f4","Type":"ContainerStarted","Data":"d12ca1b8d156219c375b215e1a968eed9399bd915908e83485cdad7c4e5f0ebf"}
Apr 21 15:36:37.338553 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:37.338468 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr" podStartSLOduration=2.004398073 podStartE2EDuration="5.338447533s" podCreationTimestamp="2026-04-21 15:36:32 +0000 UTC" firstStartedPulling="2026-04-21 15:36:33.494450328 +0000 UTC m=+82.212527653" lastFinishedPulling="2026-04-21 15:36:36.828499772 +0000 UTC m=+85.546577113" observedRunningTime="2026-04-21 15:36:37.336656364 +0000 UTC m=+86.054733713" watchObservedRunningTime="2026-04-21 15:36:37.338447533 +0000 UTC m=+86.056524880"
Apr 21 15:36:37.967426 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:37.967390 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c955d4848-qs29m"
Apr 21 15:36:38.150054 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:38.150023 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-68568f786f-8pz2q"
Apr 21 15:36:38.313725 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:38.313692 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr"
Apr 21 15:36:40.711418 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.711383 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-849b4f55f7-wwzzh"]
Apr 21 15:36:40.715948 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.715925 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.727615 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.727585 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-849b4f55f7-wwzzh"]
Apr 21 15:36:40.788374 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.788334 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-oauth-serving-cert\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.788569 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.788390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-trusted-ca-bundle\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.788569 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.788452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-config\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.788569 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.788522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-serving-cert\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.788569 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.788552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-service-ca\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.788740 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.788572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp96w\" (UniqueName: \"kubernetes.io/projected/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-kube-api-access-bp96w\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.788740 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.788598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-oauth-config\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.889007 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.888964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-config\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.889209 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.889014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-serving-cert\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.889209 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.889047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-service-ca\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.889209 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.889070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp96w\" (UniqueName: \"kubernetes.io/projected/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-kube-api-access-bp96w\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.889209 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.889102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-oauth-config\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.889209 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.889144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-oauth-serving-cert\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.889209 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.889199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-trusted-ca-bundle\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.889824 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.889794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-config\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.889968 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.889843 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-service-ca\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.889968 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.889897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-oauth-serving-cert\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.890281 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.890261 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-trusted-ca-bundle\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.891764 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.891742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-serving-cert\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.891869 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.891830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-oauth-config\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:40.898709 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:40.898689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp96w\" (UniqueName: \"kubernetes.io/projected/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-kube-api-access-bp96w\") pod \"console-849b4f55f7-wwzzh\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:41.025933 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:41.025852 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:36:41.156003 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:41.155976 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-849b4f55f7-wwzzh"]
Apr 21 15:36:41.158898 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:36:41.158872 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87358ab0_ee7e_4f54_850e_cc2bfc410aeb.slice/crio-2a9ed7960c2dc5018b568b8767ff0d0e92efcaea75599ab9d438eba0b00d931e WatchSource:0}: Error finding container 2a9ed7960c2dc5018b568b8767ff0d0e92efcaea75599ab9d438eba0b00d931e: Status 404 returned error can't find the container with id 2a9ed7960c2dc5018b568b8767ff0d0e92efcaea75599ab9d438eba0b00d931e
Apr 21 15:36:41.324662 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:41.324624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-849b4f55f7-wwzzh" event={"ID":"87358ab0-ee7e-4f54-850e-cc2bfc410aeb","Type":"ContainerStarted","Data":"0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f"}
Apr 21 15:36:41.324662 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:41.324661 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-849b4f55f7-wwzzh" event={"ID":"87358ab0-ee7e-4f54-850e-cc2bfc410aeb","Type":"ContainerStarted","Data":"2a9ed7960c2dc5018b568b8767ff0d0e92efcaea75599ab9d438eba0b00d931e"}
Apr 21 15:36:41.346296 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:41.346246 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-849b4f55f7-wwzzh" podStartSLOduration=1.346229436 podStartE2EDuration="1.346229436s" podCreationTimestamp="2026-04-21 15:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:36:41.34421225 +0000 UTC m=+90.062289610"
watchObservedRunningTime="2026-04-21 15:36:41.346229436 +0000 UTC m=+90.064306784" Apr 21 15:36:42.872241 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:42.872205 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68568f786f-8pz2q"] Apr 21 15:36:44.322615 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:44.322587 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6c7748f595-45kkr" Apr 21 15:36:51.026160 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:51.026119 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-849b4f55f7-wwzzh" Apr 21 15:36:51.026160 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:51.026164 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-849b4f55f7-wwzzh" Apr 21 15:36:51.031019 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:51.030994 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-849b4f55f7-wwzzh" Apr 21 15:36:51.358102 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:51.358073 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-849b4f55f7-wwzzh" Apr 21 15:36:51.418832 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:36:51.418792 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7578f8dd75-wkzk9"] Apr 21 15:37:00.777637 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:00.777568 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c955d4848-qs29m" podUID="0ab4e661-8ff7-460f-ab72-609d40571aad" containerName="console" containerID="cri-o://33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f" gracePeriod=15 Apr 21 15:37:01.021144 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.021112 2576 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-console_console-6c955d4848-qs29m_0ab4e661-8ff7-460f-ab72-609d40571aad/console/0.log" Apr 21 15:37:01.021261 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.021186 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:37:01.056034 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056000 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-service-ca\") pod \"0ab4e661-8ff7-460f-ab72-609d40571aad\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " Apr 21 15:37:01.056230 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056065 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-console-config\") pod \"0ab4e661-8ff7-460f-ab72-609d40571aad\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " Apr 21 15:37:01.056230 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056108 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwx5w\" (UniqueName: \"kubernetes.io/projected/0ab4e661-8ff7-460f-ab72-609d40571aad-kube-api-access-hwx5w\") pod \"0ab4e661-8ff7-460f-ab72-609d40571aad\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " Apr 21 15:37:01.056230 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056168 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-serving-cert\") pod \"0ab4e661-8ff7-460f-ab72-609d40571aad\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " Apr 21 15:37:01.056230 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056196 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-oauth-config\") pod \"0ab4e661-8ff7-460f-ab72-609d40571aad\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " Apr 21 15:37:01.056230 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056227 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-oauth-serving-cert\") pod \"0ab4e661-8ff7-460f-ab72-609d40571aad\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " Apr 21 15:37:01.056520 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056250 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-trusted-ca-bundle\") pod \"0ab4e661-8ff7-460f-ab72-609d40571aad\" (UID: \"0ab4e661-8ff7-460f-ab72-609d40571aad\") " Apr 21 15:37:01.056520 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056351 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-service-ca" (OuterVolumeSpecName: "service-ca") pod "0ab4e661-8ff7-460f-ab72-609d40571aad" (UID: "0ab4e661-8ff7-460f-ab72-609d40571aad"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:01.056631 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056517 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-console-config" (OuterVolumeSpecName: "console-config") pod "0ab4e661-8ff7-460f-ab72-609d40571aad" (UID: "0ab4e661-8ff7-460f-ab72-609d40571aad"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:01.056749 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056729 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-service-ca\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:01.056814 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056754 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-console-config\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:01.056869 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056843 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0ab4e661-8ff7-460f-ab72-609d40571aad" (UID: "0ab4e661-8ff7-460f-ab72-609d40571aad"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:01.056972 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.056951 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0ab4e661-8ff7-460f-ab72-609d40571aad" (UID: "0ab4e661-8ff7-460f-ab72-609d40571aad"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:01.058724 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.058693 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0ab4e661-8ff7-460f-ab72-609d40571aad" (UID: "0ab4e661-8ff7-460f-ab72-609d40571aad"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:37:01.058849 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.058826 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab4e661-8ff7-460f-ab72-609d40571aad-kube-api-access-hwx5w" (OuterVolumeSpecName: "kube-api-access-hwx5w") pod "0ab4e661-8ff7-460f-ab72-609d40571aad" (UID: "0ab4e661-8ff7-460f-ab72-609d40571aad"). InnerVolumeSpecName "kube-api-access-hwx5w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:37:01.059064 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.059035 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0ab4e661-8ff7-460f-ab72-609d40571aad" (UID: "0ab4e661-8ff7-460f-ab72-609d40571aad"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:37:01.157801 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.157758 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-serving-cert\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:01.157801 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.157796 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ab4e661-8ff7-460f-ab72-609d40571aad-console-oauth-config\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:01.157801 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.157806 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-oauth-serving-cert\") on node \"ip-10-0-136-162.ec2.internal\" 
DevicePath \"\"" Apr 21 15:37:01.157801 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.157815 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ab4e661-8ff7-460f-ab72-609d40571aad-trusted-ca-bundle\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:01.158076 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.157826 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwx5w\" (UniqueName: \"kubernetes.io/projected/0ab4e661-8ff7-460f-ab72-609d40571aad-kube-api-access-hwx5w\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:01.385263 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.385179 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c955d4848-qs29m_0ab4e661-8ff7-460f-ab72-609d40571aad/console/0.log" Apr 21 15:37:01.385263 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.385219 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ab4e661-8ff7-460f-ab72-609d40571aad" containerID="33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f" exitCode=2 Apr 21 15:37:01.385531 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.385255 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c955d4848-qs29m" event={"ID":"0ab4e661-8ff7-460f-ab72-609d40571aad","Type":"ContainerDied","Data":"33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f"} Apr 21 15:37:01.385531 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.385288 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c955d4848-qs29m" Apr 21 15:37:01.385531 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.385306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c955d4848-qs29m" event={"ID":"0ab4e661-8ff7-460f-ab72-609d40571aad","Type":"ContainerDied","Data":"573a3aa90714e7464be483ecfcd615a1a8c65ce863f0c96ec597c11928575901"} Apr 21 15:37:01.385531 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.385327 2576 scope.go:117] "RemoveContainer" containerID="33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f" Apr 21 15:37:01.393967 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.393944 2576 scope.go:117] "RemoveContainer" containerID="33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f" Apr 21 15:37:01.394252 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:37:01.394231 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f\": container with ID starting with 33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f not found: ID does not exist" containerID="33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f" Apr 21 15:37:01.394336 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.394262 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f"} err="failed to get container status \"33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f\": rpc error: code = NotFound desc = could not find container \"33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f\": container with ID starting with 33076cce16807a75c4577c3883535dea8a48a6fb3aa1d74da0945049e279fa9f not found: ID does not exist" Apr 21 15:37:01.408558 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.408482 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c955d4848-qs29m"] Apr 21 15:37:01.415976 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.415952 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c955d4848-qs29m"] Apr 21 15:37:01.805140 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:01.805106 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab4e661-8ff7-460f-ab72-609d40571aad" path="/var/lib/kubelet/pods/0ab4e661-8ff7-460f-ab72-609d40571aad/volumes" Apr 21 15:37:07.892778 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:07.892735 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-68568f786f-8pz2q" podUID="309490ab-206f-4ed9-9045-5effcdd68f2a" containerName="registry" containerID="cri-o://9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621" gracePeriod=30 Apr 21 15:37:08.138875 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.138848 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:37:08.219517 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.219389 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-installation-pull-secrets\") pod \"309490ab-206f-4ed9-9045-5effcdd68f2a\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " Apr 21 15:37:08.219517 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.219434 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls\") pod \"309490ab-206f-4ed9-9045-5effcdd68f2a\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " Apr 21 15:37:08.219517 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.219461 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-bound-sa-token\") pod \"309490ab-206f-4ed9-9045-5effcdd68f2a\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " Apr 21 15:37:08.219885 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.219656 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-certificates\") pod \"309490ab-206f-4ed9-9045-5effcdd68f2a\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " Apr 21 15:37:08.219885 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.219721 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-image-registry-private-configuration\") pod \"309490ab-206f-4ed9-9045-5effcdd68f2a\" (UID: 
\"309490ab-206f-4ed9-9045-5effcdd68f2a\") " Apr 21 15:37:08.219885 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.219750 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/309490ab-206f-4ed9-9045-5effcdd68f2a-ca-trust-extracted\") pod \"309490ab-206f-4ed9-9045-5effcdd68f2a\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " Apr 21 15:37:08.219885 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.219811 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-trusted-ca\") pod \"309490ab-206f-4ed9-9045-5effcdd68f2a\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " Apr 21 15:37:08.219885 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.219857 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4flc\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-kube-api-access-x4flc\") pod \"309490ab-206f-4ed9-9045-5effcdd68f2a\" (UID: \"309490ab-206f-4ed9-9045-5effcdd68f2a\") " Apr 21 15:37:08.220134 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.220082 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "309490ab-206f-4ed9-9045-5effcdd68f2a" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:08.220224 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.220202 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-certificates\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:08.220345 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.220309 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "309490ab-206f-4ed9-9045-5effcdd68f2a" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:08.222197 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.222164 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "309490ab-206f-4ed9-9045-5effcdd68f2a" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:37:08.222412 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.222380 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "309490ab-206f-4ed9-9045-5effcdd68f2a" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:37:08.222519 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.222423 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "309490ab-206f-4ed9-9045-5effcdd68f2a" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:37:08.222519 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.222451 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "309490ab-206f-4ed9-9045-5effcdd68f2a" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:37:08.222640 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.222624 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-kube-api-access-x4flc" (OuterVolumeSpecName: "kube-api-access-x4flc") pod "309490ab-206f-4ed9-9045-5effcdd68f2a" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a"). InnerVolumeSpecName "kube-api-access-x4flc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:37:08.228098 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.228063 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/309490ab-206f-4ed9-9045-5effcdd68f2a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "309490ab-206f-4ed9-9045-5effcdd68f2a" (UID: "309490ab-206f-4ed9-9045-5effcdd68f2a"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:37:08.320802 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.320763 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/309490ab-206f-4ed9-9045-5effcdd68f2a-trusted-ca\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:08.320802 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.320794 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x4flc\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-kube-api-access-x4flc\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:08.320802 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.320805 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-installation-pull-secrets\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:08.320802 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.320815 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-registry-tls\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:08.321081 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.320824 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/309490ab-206f-4ed9-9045-5effcdd68f2a-bound-sa-token\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:08.321081 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.320836 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/309490ab-206f-4ed9-9045-5effcdd68f2a-image-registry-private-configuration\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 
15:37:08.321081 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.320848 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/309490ab-206f-4ed9-9045-5effcdd68f2a-ca-trust-extracted\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:08.408683 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.408643 2576 generic.go:358] "Generic (PLEG): container finished" podID="309490ab-206f-4ed9-9045-5effcdd68f2a" containerID="9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621" exitCode=0 Apr 21 15:37:08.408858 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.408716 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68568f786f-8pz2q" Apr 21 15:37:08.408858 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.408735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68568f786f-8pz2q" event={"ID":"309490ab-206f-4ed9-9045-5effcdd68f2a","Type":"ContainerDied","Data":"9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621"} Apr 21 15:37:08.408858 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.408778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68568f786f-8pz2q" event={"ID":"309490ab-206f-4ed9-9045-5effcdd68f2a","Type":"ContainerDied","Data":"08d89baabbca8b9f98730d833b057807050530b67ff5cc8f8d2e61f71267efc8"} Apr 21 15:37:08.408858 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.408795 2576 scope.go:117] "RemoveContainer" containerID="9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621" Apr 21 15:37:08.418426 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.418391 2576 scope.go:117] "RemoveContainer" containerID="9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621" Apr 21 15:37:08.419239 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:37:08.419212 2576 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621\": container with ID starting with 9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621 not found: ID does not exist" containerID="9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621" Apr 21 15:37:08.419349 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.419248 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621"} err="failed to get container status \"9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621\": rpc error: code = NotFound desc = could not find container \"9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621\": container with ID starting with 9c8d3b71414fda293b67e60c79262d474893f446ff67428e47605b3cee1b5621 not found: ID does not exist" Apr 21 15:37:08.444629 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.444592 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68568f786f-8pz2q"] Apr 21 15:37:08.453250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:08.453221 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-68568f786f-8pz2q"] Apr 21 15:37:09.801942 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:09.801905 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="309490ab-206f-4ed9-9045-5effcdd68f2a" path="/var/lib/kubelet/pods/309490ab-206f-4ed9-9045-5effcdd68f2a/volumes" Apr 21 15:37:13.425811 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:13.425770 2576 generic.go:358] "Generic (PLEG): container finished" podID="8287ddd5-c147-400c-b1e7-382801765df6" containerID="f782612981949c70105200660c637163eaf47a860051a09171e5b5842a742999" exitCode=0 Apr 21 15:37:13.426228 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:37:13.425844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" event={"ID":"8287ddd5-c147-400c-b1e7-382801765df6","Type":"ContainerDied","Data":"f782612981949c70105200660c637163eaf47a860051a09171e5b5842a742999"} Apr 21 15:37:13.426228 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:13.426204 2576 scope.go:117] "RemoveContainer" containerID="f782612981949c70105200660c637163eaf47a860051a09171e5b5842a742999" Apr 21 15:37:14.432965 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:14.432905 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fcfg7" event={"ID":"8287ddd5-c147-400c-b1e7-382801765df6","Type":"ContainerStarted","Data":"b3d0ff40dc9ae637137ba7a9f69c03d23677febc62ae4e09037b08a88add6a12"} Apr 21 15:37:16.444331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.444241 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7578f8dd75-wkzk9" podUID="be214c51-9db0-4029-aed5-20b000e89c05" containerName="console" containerID="cri-o://24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20" gracePeriod=15 Apr 21 15:37:16.699649 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.699589 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7578f8dd75-wkzk9_be214c51-9db0-4029-aed5-20b000e89c05/console/0.log" Apr 21 15:37:16.699757 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.699653 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:37:16.791679 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.791645 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4xk4\" (UniqueName: \"kubernetes.io/projected/be214c51-9db0-4029-aed5-20b000e89c05-kube-api-access-x4xk4\") pod \"be214c51-9db0-4029-aed5-20b000e89c05\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " Apr 21 15:37:16.791679 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.791680 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-oauth-serving-cert\") pod \"be214c51-9db0-4029-aed5-20b000e89c05\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " Apr 21 15:37:16.791906 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.791817 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-oauth-config\") pod \"be214c51-9db0-4029-aed5-20b000e89c05\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " Apr 21 15:37:16.791906 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.791856 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-serving-cert\") pod \"be214c51-9db0-4029-aed5-20b000e89c05\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " Apr 21 15:37:16.791906 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.791893 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-console-config\") pod \"be214c51-9db0-4029-aed5-20b000e89c05\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " Apr 21 
15:37:16.792030 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.791930 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-service-ca\") pod \"be214c51-9db0-4029-aed5-20b000e89c05\" (UID: \"be214c51-9db0-4029-aed5-20b000e89c05\") " Apr 21 15:37:16.792113 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.792092 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "be214c51-9db0-4029-aed5-20b000e89c05" (UID: "be214c51-9db0-4029-aed5-20b000e89c05"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:16.792294 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.792271 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-oauth-serving-cert\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:16.792413 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.792335 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-console-config" (OuterVolumeSpecName: "console-config") pod "be214c51-9db0-4029-aed5-20b000e89c05" (UID: "be214c51-9db0-4029-aed5-20b000e89c05"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:16.792413 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.792396 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-service-ca" (OuterVolumeSpecName: "service-ca") pod "be214c51-9db0-4029-aed5-20b000e89c05" (UID: "be214c51-9db0-4029-aed5-20b000e89c05"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:16.793979 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.793945 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "be214c51-9db0-4029-aed5-20b000e89c05" (UID: "be214c51-9db0-4029-aed5-20b000e89c05"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:37:16.793979 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.793964 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "be214c51-9db0-4029-aed5-20b000e89c05" (UID: "be214c51-9db0-4029-aed5-20b000e89c05"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:37:16.794103 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.794039 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be214c51-9db0-4029-aed5-20b000e89c05-kube-api-access-x4xk4" (OuterVolumeSpecName: "kube-api-access-x4xk4") pod "be214c51-9db0-4029-aed5-20b000e89c05" (UID: "be214c51-9db0-4029-aed5-20b000e89c05"). InnerVolumeSpecName "kube-api-access-x4xk4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:37:16.892764 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.892724 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-oauth-config\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:16.892764 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.892759 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be214c51-9db0-4029-aed5-20b000e89c05-console-serving-cert\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:16.892764 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.892769 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-console-config\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:16.893003 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.892780 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be214c51-9db0-4029-aed5-20b000e89c05-service-ca\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:16.893003 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:16.892790 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x4xk4\" (UniqueName: \"kubernetes.io/projected/be214c51-9db0-4029-aed5-20b000e89c05-kube-api-access-x4xk4\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:37:17.442997 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:17.442972 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7578f8dd75-wkzk9_be214c51-9db0-4029-aed5-20b000e89c05/console/0.log" Apr 21 15:37:17.443203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:17.443014 2576 generic.go:358] "Generic 
(PLEG): container finished" podID="be214c51-9db0-4029-aed5-20b000e89c05" containerID="24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20" exitCode=2 Apr 21 15:37:17.443203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:17.443057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7578f8dd75-wkzk9" event={"ID":"be214c51-9db0-4029-aed5-20b000e89c05","Type":"ContainerDied","Data":"24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20"} Apr 21 15:37:17.443203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:17.443080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7578f8dd75-wkzk9" event={"ID":"be214c51-9db0-4029-aed5-20b000e89c05","Type":"ContainerDied","Data":"62c2cce603f4b3d61317a9bef5ef426a0442a0fa9050429713e08761d5b011c9"} Apr 21 15:37:17.443203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:17.443094 2576 scope.go:117] "RemoveContainer" containerID="24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20" Apr 21 15:37:17.443203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:17.443099 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7578f8dd75-wkzk9" Apr 21 15:37:17.452107 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:17.451952 2576 scope.go:117] "RemoveContainer" containerID="24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20" Apr 21 15:37:17.452356 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:37:17.452252 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20\": container with ID starting with 24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20 not found: ID does not exist" containerID="24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20" Apr 21 15:37:17.452356 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:17.452284 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20"} err="failed to get container status \"24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20\": rpc error: code = NotFound desc = could not find container \"24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20\": container with ID starting with 24a74df83694fc7c717ee813cecfeffcb3e840cb0220103dd7592928defbcb20 not found: ID does not exist" Apr 21 15:37:17.477955 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:17.477921 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7578f8dd75-wkzk9"] Apr 21 15:37:17.487296 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:17.487266 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7578f8dd75-wkzk9"] Apr 21 15:37:17.806054 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:17.803789 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be214c51-9db0-4029-aed5-20b000e89c05" 
path="/var/lib/kubelet/pods/be214c51-9db0-4029-aed5-20b000e89c05/volumes" Apr 21 15:37:23.463755 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:23.463715 2576 generic.go:358] "Generic (PLEG): container finished" podID="934d29fe-8f2c-43c4-850e-f630d78f8e46" containerID="5ce09b69a0a622c437e4491e7360b6665a0c15e92bb0ebc1acbb811cb4000b76" exitCode=0 Apr 21 15:37:23.464148 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:23.463789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" event={"ID":"934d29fe-8f2c-43c4-850e-f630d78f8e46","Type":"ContainerDied","Data":"5ce09b69a0a622c437e4491e7360b6665a0c15e92bb0ebc1acbb811cb4000b76"} Apr 21 15:37:23.464148 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:23.464137 2576 scope.go:117] "RemoveContainer" containerID="5ce09b69a0a622c437e4491e7360b6665a0c15e92bb0ebc1acbb811cb4000b76" Apr 21 15:37:24.468428 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:37:24.468391 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4c4f4" event={"ID":"934d29fe-8f2c-43c4-850e-f630d78f8e46","Type":"ContainerStarted","Data":"c7abf2aba0748aaf9ba07cda70e9b9d92910c4494cce23e27384c53ec981dcf3"} Apr 21 15:38:07.659316 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.659216 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7dd9967dc7-r8jjk"] Apr 21 15:38:07.659890 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.659701 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ab4e661-8ff7-460f-ab72-609d40571aad" containerName="console" Apr 21 15:38:07.659890 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.659721 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab4e661-8ff7-460f-ab72-609d40571aad" containerName="console" Apr 21 15:38:07.659890 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.659741 2576 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="309490ab-206f-4ed9-9045-5effcdd68f2a" containerName="registry" Apr 21 15:38:07.659890 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.659749 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="309490ab-206f-4ed9-9045-5effcdd68f2a" containerName="registry" Apr 21 15:38:07.659890 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.659770 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be214c51-9db0-4029-aed5-20b000e89c05" containerName="console" Apr 21 15:38:07.659890 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.659780 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="be214c51-9db0-4029-aed5-20b000e89c05" containerName="console" Apr 21 15:38:07.659890 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.659858 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="be214c51-9db0-4029-aed5-20b000e89c05" containerName="console" Apr 21 15:38:07.659890 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.659872 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="309490ab-206f-4ed9-9045-5effcdd68f2a" containerName="registry" Apr 21 15:38:07.659890 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.659881 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ab4e661-8ff7-460f-ab72-609d40571aad" containerName="console" Apr 21 15:38:07.662991 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.662973 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.675545 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.675515 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7dd9967dc7-r8jjk"] Apr 21 15:38:07.678846 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.678804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-console-config\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.678976 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.678863 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-serving-cert\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.678976 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.678895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-service-ca\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.678976 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.678941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78xmt\" (UniqueName: \"kubernetes.io/projected/8fef7932-6121-413e-bc03-e56177bd66ff-kube-api-access-78xmt\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 
15:38:07.679077 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.678976 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-oauth-config\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.679077 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.679004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-trusted-ca-bundle\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.679136 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.679084 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-oauth-serving-cert\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.779484 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.779442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78xmt\" (UniqueName: \"kubernetes.io/projected/8fef7932-6121-413e-bc03-e56177bd66ff-kube-api-access-78xmt\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.779484 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.779518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-oauth-config\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.779784 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.779538 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-trusted-ca-bundle\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.779784 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.779583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-oauth-serving-cert\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.779784 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.779615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-console-config\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.779929 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.779767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-serving-cert\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.779929 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.779825 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-service-ca\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.780360 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.780332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-oauth-serving-cert\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.780515 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.780429 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-service-ca\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.780627 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.780606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-trusted-ca-bundle\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.780940 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.780918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-console-config\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.782268 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.782244 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-serving-cert\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.782416 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.782394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-oauth-config\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.789507 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.789475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78xmt\" (UniqueName: \"kubernetes.io/projected/8fef7932-6121-413e-bc03-e56177bd66ff-kube-api-access-78xmt\") pod \"console-7dd9967dc7-r8jjk\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:07.972537 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:07.972416 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:08.111146 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:08.111097 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7dd9967dc7-r8jjk"] Apr 21 15:38:08.113468 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:38:08.113432 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fef7932_6121_413e_bc03_e56177bd66ff.slice/crio-d852652ebb00b2ec60d528fc38c43f8c975c3039f39b3d56ac23c64c2e79ef37 WatchSource:0}: Error finding container d852652ebb00b2ec60d528fc38c43f8c975c3039f39b3d56ac23c64c2e79ef37: Status 404 returned error can't find the container with id d852652ebb00b2ec60d528fc38c43f8c975c3039f39b3d56ac23c64c2e79ef37 Apr 21 15:38:08.602401 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:08.602356 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7dd9967dc7-r8jjk" event={"ID":"8fef7932-6121-413e-bc03-e56177bd66ff","Type":"ContainerStarted","Data":"a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21"} Apr 21 15:38:08.602401 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:08.602397 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7dd9967dc7-r8jjk" event={"ID":"8fef7932-6121-413e-bc03-e56177bd66ff","Type":"ContainerStarted","Data":"d852652ebb00b2ec60d528fc38c43f8c975c3039f39b3d56ac23c64c2e79ef37"} Apr 21 15:38:08.625511 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:08.625422 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7dd9967dc7-r8jjk" podStartSLOduration=1.625404718 podStartE2EDuration="1.625404718s" podCreationTimestamp="2026-04-21 15:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:38:08.624617285 +0000 UTC 
m=+177.342694635" watchObservedRunningTime="2026-04-21 15:38:08.625404718 +0000 UTC m=+177.343482067" Apr 21 15:38:17.973415 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:17.973376 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:17.973415 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:17.973421 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:17.978167 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:17.978141 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:18.639379 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:18.639349 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:38:18.694983 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:18.694952 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-849b4f55f7-wwzzh"] Apr 21 15:38:43.716065 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:43.716002 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-849b4f55f7-wwzzh" podUID="87358ab0-ee7e-4f54-850e-cc2bfc410aeb" containerName="console" containerID="cri-o://0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f" gracePeriod=15 Apr 21 15:38:43.950562 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:43.950536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-849b4f55f7-wwzzh_87358ab0-ee7e-4f54-850e-cc2bfc410aeb/console/0.log" Apr 21 15:38:43.950696 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:43.950599 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-849b4f55f7-wwzzh" Apr 21 15:38:44.095773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.095739 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-config\") pod \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " Apr 21 15:38:44.095773 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.095780 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp96w\" (UniqueName: \"kubernetes.io/projected/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-kube-api-access-bp96w\") pod \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " Apr 21 15:38:44.096040 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.095813 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-service-ca\") pod \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " Apr 21 15:38:44.096040 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.095844 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-oauth-serving-cert\") pod \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " Apr 21 15:38:44.096040 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.095896 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-trusted-ca-bundle\") pod \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " Apr 21 15:38:44.096040 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:38:44.095935 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-serving-cert\") pod \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " Apr 21 15:38:44.096040 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.095951 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-oauth-config\") pod \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\" (UID: \"87358ab0-ee7e-4f54-850e-cc2bfc410aeb\") " Apr 21 15:38:44.096397 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.096296 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "87358ab0-ee7e-4f54-850e-cc2bfc410aeb" (UID: "87358ab0-ee7e-4f54-850e-cc2bfc410aeb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:38:44.096397 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.096348 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-service-ca" (OuterVolumeSpecName: "service-ca") pod "87358ab0-ee7e-4f54-850e-cc2bfc410aeb" (UID: "87358ab0-ee7e-4f54-850e-cc2bfc410aeb"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:38:44.096556 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.096419 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "87358ab0-ee7e-4f54-850e-cc2bfc410aeb" (UID: "87358ab0-ee7e-4f54-850e-cc2bfc410aeb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:38:44.096556 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.096441 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-config" (OuterVolumeSpecName: "console-config") pod "87358ab0-ee7e-4f54-850e-cc2bfc410aeb" (UID: "87358ab0-ee7e-4f54-850e-cc2bfc410aeb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:38:44.098069 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.098040 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-kube-api-access-bp96w" (OuterVolumeSpecName: "kube-api-access-bp96w") pod "87358ab0-ee7e-4f54-850e-cc2bfc410aeb" (UID: "87358ab0-ee7e-4f54-850e-cc2bfc410aeb"). InnerVolumeSpecName "kube-api-access-bp96w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:38:44.098069 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.098048 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "87358ab0-ee7e-4f54-850e-cc2bfc410aeb" (UID: "87358ab0-ee7e-4f54-850e-cc2bfc410aeb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:38:44.098202 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.098093 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "87358ab0-ee7e-4f54-850e-cc2bfc410aeb" (UID: "87358ab0-ee7e-4f54-850e-cc2bfc410aeb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:38:44.197417 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.197371 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-service-ca\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:38:44.197417 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.197409 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-oauth-serving-cert\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:38:44.197417 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.197424 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-trusted-ca-bundle\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:38:44.197689 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.197437 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-serving-cert\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:38:44.197689 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.197449 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-oauth-config\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:38:44.197689 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.197462 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-console-config\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:38:44.197689 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.197473 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bp96w\" (UniqueName: \"kubernetes.io/projected/87358ab0-ee7e-4f54-850e-cc2bfc410aeb-kube-api-access-bp96w\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:38:44.716591 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.716565 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-849b4f55f7-wwzzh_87358ab0-ee7e-4f54-850e-cc2bfc410aeb/console/0.log"
Apr 21 15:38:44.717027 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.716605 2576 generic.go:358] "Generic (PLEG): container finished" podID="87358ab0-ee7e-4f54-850e-cc2bfc410aeb" containerID="0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f" exitCode=2
Apr 21 15:38:44.717027 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.716639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-849b4f55f7-wwzzh" event={"ID":"87358ab0-ee7e-4f54-850e-cc2bfc410aeb","Type":"ContainerDied","Data":"0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f"}
Apr 21 15:38:44.717027 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.716660 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-849b4f55f7-wwzzh" event={"ID":"87358ab0-ee7e-4f54-850e-cc2bfc410aeb","Type":"ContainerDied","Data":"2a9ed7960c2dc5018b568b8767ff0d0e92efcaea75599ab9d438eba0b00d931e"}
Apr 21 15:38:44.717027 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.716669 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-849b4f55f7-wwzzh"
Apr 21 15:38:44.717027 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.716675 2576 scope.go:117] "RemoveContainer" containerID="0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f"
Apr 21 15:38:44.725075 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.725058 2576 scope.go:117] "RemoveContainer" containerID="0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f"
Apr 21 15:38:44.725315 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:38:44.725297 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f\": container with ID starting with 0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f not found: ID does not exist" containerID="0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f"
Apr 21 15:38:44.725384 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.725326 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f"} err="failed to get container status \"0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f\": rpc error: code = NotFound desc = could not find container \"0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f\": container with ID starting with 0f11faf0f93a6dd25b297aee884b08be2cf3da4943de2257e2fe3a0a05aa2a9f not found: ID does not exist"
Apr 21 15:38:44.747507 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.747468 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-849b4f55f7-wwzzh"]
Apr 21 15:38:44.758907 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:44.758880 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-849b4f55f7-wwzzh"]
Apr 21 15:38:45.801915 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:38:45.801879 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87358ab0-ee7e-4f54-850e-cc2bfc410aeb" path="/var/lib/kubelet/pods/87358ab0-ee7e-4f54-850e-cc2bfc410aeb/volumes"
Apr 21 15:39:27.112265 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.112185 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"]
Apr 21 15:39:27.112739 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.112572 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87358ab0-ee7e-4f54-850e-cc2bfc410aeb" containerName="console"
Apr 21 15:39:27.112739 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.112586 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="87358ab0-ee7e-4f54-850e-cc2bfc410aeb" containerName="console"
Apr 21 15:39:27.112739 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.112647 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="87358ab0-ee7e-4f54-850e-cc2bfc410aeb" containerName="console"
Apr 21 15:39:27.115682 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.115666 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:27.118443 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.118420 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 15:39:27.118587 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.118445 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 15:39:27.119631 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.119613 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4t2d5\""
Apr 21 15:39:27.133141 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.133117 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"]
Apr 21 15:39:27.255673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.255627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:27.255673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.255678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz4r7\" (UniqueName: \"kubernetes.io/projected/6b981019-86d3-40ec-99d3-eec91c6f2709-kube-api-access-qz4r7\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:27.255910 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.255702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:27.356476 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.356424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz4r7\" (UniqueName: \"kubernetes.io/projected/6b981019-86d3-40ec-99d3-eec91c6f2709-kube-api-access-qz4r7\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:27.356476 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.356480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:27.356768 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.356586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:27.356854 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.356834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:27.356905 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.356860 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:27.366383 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.366319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz4r7\" (UniqueName: \"kubernetes.io/projected/6b981019-86d3-40ec-99d3-eec91c6f2709-kube-api-access-qz4r7\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:27.425058 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.425023 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:27.555295 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.555259 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"]
Apr 21 15:39:27.559394 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:39:27.559364 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b981019_86d3_40ec_99d3_eec91c6f2709.slice/crio-f236487bf68688a81f871d653ee7b42ce0816ddd05bf7988b8302138226f2f24 WatchSource:0}: Error finding container f236487bf68688a81f871d653ee7b42ce0816ddd05bf7988b8302138226f2f24: Status 404 returned error can't find the container with id f236487bf68688a81f871d653ee7b42ce0816ddd05bf7988b8302138226f2f24
Apr 21 15:39:27.841087 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:27.841050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd" event={"ID":"6b981019-86d3-40ec-99d3-eec91c6f2709","Type":"ContainerStarted","Data":"f236487bf68688a81f871d653ee7b42ce0816ddd05bf7988b8302138226f2f24"}
Apr 21 15:39:32.859770 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:32.859733 2576 generic.go:358] "Generic (PLEG): container finished" podID="6b981019-86d3-40ec-99d3-eec91c6f2709" containerID="1004ba87a9039e63797e0be3e1300df9d0cd349f850957fc8ec22cbf2249b22a" exitCode=0
Apr 21 15:39:32.860185 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:32.859786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd" event={"ID":"6b981019-86d3-40ec-99d3-eec91c6f2709","Type":"ContainerDied","Data":"1004ba87a9039e63797e0be3e1300df9d0cd349f850957fc8ec22cbf2249b22a"}
Apr 21 15:39:35.870228 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:35.870195 2576 generic.go:358] "Generic (PLEG): container finished" podID="6b981019-86d3-40ec-99d3-eec91c6f2709" containerID="b024f4a265a7d026fe93351ead836c6490bb2bb9b5db3a16996258f417bd6431" exitCode=0
Apr 21 15:39:35.870619 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:35.870262 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd" event={"ID":"6b981019-86d3-40ec-99d3-eec91c6f2709","Type":"ContainerDied","Data":"b024f4a265a7d026fe93351ead836c6490bb2bb9b5db3a16996258f417bd6431"}
Apr 21 15:39:43.899520 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:43.899465 2576 generic.go:358] "Generic (PLEG): container finished" podID="6b981019-86d3-40ec-99d3-eec91c6f2709" containerID="11c917f24499d347df032b5aed360b69b364c31ea54ef9dba6b64b37c58d936c" exitCode=0
Apr 21 15:39:43.899912 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:43.899552 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd" event={"ID":"6b981019-86d3-40ec-99d3-eec91c6f2709","Type":"ContainerDied","Data":"11c917f24499d347df032b5aed360b69b364c31ea54ef9dba6b64b37c58d936c"}
Apr 21 15:39:45.025104 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.025081 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:45.123123 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.123089 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz4r7\" (UniqueName: \"kubernetes.io/projected/6b981019-86d3-40ec-99d3-eec91c6f2709-kube-api-access-qz4r7\") pod \"6b981019-86d3-40ec-99d3-eec91c6f2709\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") "
Apr 21 15:39:45.123319 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.123157 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-bundle\") pod \"6b981019-86d3-40ec-99d3-eec91c6f2709\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") "
Apr 21 15:39:45.123319 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.123180 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-util\") pod \"6b981019-86d3-40ec-99d3-eec91c6f2709\" (UID: \"6b981019-86d3-40ec-99d3-eec91c6f2709\") "
Apr 21 15:39:45.123739 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.123704 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-bundle" (OuterVolumeSpecName: "bundle") pod "6b981019-86d3-40ec-99d3-eec91c6f2709" (UID: "6b981019-86d3-40ec-99d3-eec91c6f2709"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:39:45.125302 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.125280 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b981019-86d3-40ec-99d3-eec91c6f2709-kube-api-access-qz4r7" (OuterVolumeSpecName: "kube-api-access-qz4r7") pod "6b981019-86d3-40ec-99d3-eec91c6f2709" (UID: "6b981019-86d3-40ec-99d3-eec91c6f2709"). InnerVolumeSpecName "kube-api-access-qz4r7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:39:45.127711 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.127692 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-util" (OuterVolumeSpecName: "util") pod "6b981019-86d3-40ec-99d3-eec91c6f2709" (UID: "6b981019-86d3-40ec-99d3-eec91c6f2709"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:39:45.224592 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.224476 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qz4r7\" (UniqueName: \"kubernetes.io/projected/6b981019-86d3-40ec-99d3-eec91c6f2709-kube-api-access-qz4r7\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:39:45.224592 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.224530 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-bundle\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:39:45.224592 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.224539 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b981019-86d3-40ec-99d3-eec91c6f2709-util\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:39:45.907732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.907690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd" event={"ID":"6b981019-86d3-40ec-99d3-eec91c6f2709","Type":"ContainerDied","Data":"f236487bf68688a81f871d653ee7b42ce0816ddd05bf7988b8302138226f2f24"}
Apr 21 15:39:45.907732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.907723 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f236487bf68688a81f871d653ee7b42ce0816ddd05bf7988b8302138226f2f24"
Apr 21 15:39:45.907964 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:45.907756 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cccgwd"
Apr 21 15:39:49.793557 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.793516 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"]
Apr 21 15:39:49.794034 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.794016 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b981019-86d3-40ec-99d3-eec91c6f2709" containerName="pull"
Apr 21 15:39:49.794103 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.794036 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b981019-86d3-40ec-99d3-eec91c6f2709" containerName="pull"
Apr 21 15:39:49.794103 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.794068 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b981019-86d3-40ec-99d3-eec91c6f2709" containerName="util"
Apr 21 15:39:49.794103 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.794077 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b981019-86d3-40ec-99d3-eec91c6f2709" containerName="util"
Apr 21 15:39:49.794103 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.794092 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b981019-86d3-40ec-99d3-eec91c6f2709" containerName="extract"
Apr 21 15:39:49.794103 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.794102 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b981019-86d3-40ec-99d3-eec91c6f2709" containerName="extract"
Apr 21 15:39:49.794343 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.794178 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b981019-86d3-40ec-99d3-eec91c6f2709" containerName="extract"
Apr 21 15:39:49.801386 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.801354 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"
Apr 21 15:39:49.804286 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.804254 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 21 15:39:49.804424 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.804337 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-tnjwm\""
Apr 21 15:39:49.805101 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.805080 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 21 15:39:49.805227 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.805170 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 21 15:39:49.808656 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.808635 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"]
Apr 21 15:39:49.963367 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.963321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcr65\" (UniqueName: \"kubernetes.io/projected/272089c2-27bc-4bb8-a4a5-0a55837d6002-kube-api-access-lcr65\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5njx6\" (UID: \"272089c2-27bc-4bb8-a4a5-0a55837d6002\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"
Apr 21 15:39:49.963596 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:49.963451 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/272089c2-27bc-4bb8-a4a5-0a55837d6002-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5njx6\" (UID: \"272089c2-27bc-4bb8-a4a5-0a55837d6002\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"
Apr 21 15:39:50.064494 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:50.064460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/272089c2-27bc-4bb8-a4a5-0a55837d6002-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5njx6\" (UID: \"272089c2-27bc-4bb8-a4a5-0a55837d6002\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"
Apr 21 15:39:50.064660 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:50.064555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcr65\" (UniqueName: \"kubernetes.io/projected/272089c2-27bc-4bb8-a4a5-0a55837d6002-kube-api-access-lcr65\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5njx6\" (UID: \"272089c2-27bc-4bb8-a4a5-0a55837d6002\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"
Apr 21 15:39:50.066925 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:50.066901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/272089c2-27bc-4bb8-a4a5-0a55837d6002-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5njx6\" (UID: \"272089c2-27bc-4bb8-a4a5-0a55837d6002\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"
Apr 21 15:39:50.075262 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:50.075236 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcr65\" (UniqueName: \"kubernetes.io/projected/272089c2-27bc-4bb8-a4a5-0a55837d6002-kube-api-access-lcr65\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5njx6\" (UID: \"272089c2-27bc-4bb8-a4a5-0a55837d6002\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"
Apr 21 15:39:50.113234 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:50.113190 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"
Apr 21 15:39:50.245664 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:50.245633 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"]
Apr 21 15:39:50.248479 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:39:50.248450 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod272089c2_27bc_4bb8_a4a5_0a55837d6002.slice/crio-fd2d212cdd797fa87377183f9de42f3fbbc50c7f25811488eccc0e997956550b WatchSource:0}: Error finding container fd2d212cdd797fa87377183f9de42f3fbbc50c7f25811488eccc0e997956550b: Status 404 returned error can't find the container with id fd2d212cdd797fa87377183f9de42f3fbbc50c7f25811488eccc0e997956550b
Apr 21 15:39:50.923922 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:50.923882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6" event={"ID":"272089c2-27bc-4bb8-a4a5-0a55837d6002","Type":"ContainerStarted","Data":"fd2d212cdd797fa87377183f9de42f3fbbc50c7f25811488eccc0e997956550b"}
Apr 21 15:39:55.735742 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.735698 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-wr4r8"]
Apr 21 15:39:55.739277 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.739251 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-wr4r8"
Apr 21 15:39:55.758705 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.758674 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 21 15:39:55.761578 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.761550 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 21 15:39:55.766350 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.766325 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-8twzb\""
Apr 21 15:39:55.767881 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.767852 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-wr4r8"]
Apr 21 15:39:55.811546 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.811502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/29645320-2941-4f2c-9898-10778bf22857-cabundle0\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8"
Apr 21 15:39:55.811812 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.811753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8"
Apr 21 15:39:55.811919 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.811887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htl9d\" (UniqueName: \"kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-kube-api-access-htl9d\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8"
Apr 21 15:39:55.912326 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.912284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8"
Apr 21 15:39:55.912576 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.912349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htl9d\" (UniqueName: \"kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-kube-api-access-htl9d\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8"
Apr 21 15:39:55.912576 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.912372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/29645320-2941-4f2c-9898-10778bf22857-cabundle0\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8"
Apr 21 15:39:55.912576 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:55.912441 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 21 15:39:55.912576 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:55.912462 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 15:39:55.912576 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:55.912472 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-wr4r8: references non-existent secret key: ca.crt
Apr 21 15:39:55.912576 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:55.912557 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates podName:29645320-2941-4f2c-9898-10778bf22857 nodeName:}" failed. No retries permitted until 2026-04-21 15:39:56.412540886 +0000 UTC m=+285.130618211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates") pod "keda-operator-ffbb595cb-wr4r8" (UID: "29645320-2941-4f2c-9898-10778bf22857") : references non-existent secret key: ca.crt
Apr 21 15:39:55.912984 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.912964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/29645320-2941-4f2c-9898-10778bf22857-cabundle0\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8"
Apr 21 15:39:55.925381 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.925352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htl9d\" (UniqueName: \"kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-kube-api-access-htl9d\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8"
Apr 21 15:39:55.946469 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.946433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6" event={"ID":"272089c2-27bc-4bb8-a4a5-0a55837d6002","Type":"ContainerStarted","Data":"205307e67c64e60511416f265005e1319aca906d299fe3ff87590235d6da5cd2"}
Apr 21 15:39:55.946649 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.946563 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6"
Apr 21 15:39:55.980692 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:55.980638 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6" podStartSLOduration=2.194949854 podStartE2EDuration="6.980616249s" podCreationTimestamp="2026-04-21 15:39:49 +0000 UTC" firstStartedPulling="2026-04-21 15:39:50.250342385 +0000 UTC m=+278.968419715" lastFinishedPulling="2026-04-21 15:39:55.036008783 +0000 UTC m=+283.754086110" observedRunningTime="2026-04-21 15:39:55.979617266 +0000 UTC m=+284.697694636" watchObservedRunningTime="2026-04-21 15:39:55.980616249 +0000 UTC m=+284.698693596"
Apr 21 15:39:56.164734 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.164697 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt"]
Apr 21 15:39:56.168129 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.168112 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt"
Apr 21 15:39:56.171408 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.171385 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 21 15:39:56.183783 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.183757 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt"]
Apr 21 15:39:56.214862 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.214825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6b259eca-069b-497d-8860-a863b6bca0d8-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt"
Apr 21 15:39:56.215021 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.214873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt"
Apr 21 15:39:56.215021 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.214968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7dtk\" (UniqueName: \"kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-kube-api-access-v7dtk\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt"
Apr 21 15:39:56.315386 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.315348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7dtk\" (UniqueName: \"kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-kube-api-access-v7dtk\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt"
Apr 21 15:39:56.315586 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.315409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6b259eca-069b-497d-8860-a863b6bca0d8-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt"
Apr 21 15:39:56.315586 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.315435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt"
Apr 21 15:39:56.315586 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.315574 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 21 15:39:56.315586 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.315586 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 15:39:56.315744 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.315600 2576 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 21 15:39:56.315744 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.315621 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt: [references
non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 21 15:39:56.315744 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.315678 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates podName:6b259eca-069b-497d-8860-a863b6bca0d8 nodeName:}" failed. No retries permitted until 2026-04-21 15:39:56.815664385 +0000 UTC m=+285.533741711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates") pod "keda-metrics-apiserver-7c9f485588-ss9jt" (UID: "6b259eca-069b-497d-8860-a863b6bca0d8") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 21 15:39:56.315900 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.315880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6b259eca-069b-497d-8860-a863b6bca0d8-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" Apr 21 15:39:56.341024 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.340986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7dtk\" (UniqueName: \"kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-kube-api-access-v7dtk\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" Apr 21 15:39:56.417030 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.416936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: 
\"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8" Apr 21 15:39:56.417163 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.417076 2576 secret.go:281] references non-existent secret key: ca.crt Apr 21 15:39:56.417163 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.417095 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 21 15:39:56.417163 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.417105 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-wr4r8: references non-existent secret key: ca.crt Apr 21 15:39:56.417163 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.417154 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates podName:29645320-2941-4f2c-9898-10778bf22857 nodeName:}" failed. No retries permitted until 2026-04-21 15:39:57.417140247 +0000 UTC m=+286.135217573 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates") pod "keda-operator-ffbb595cb-wr4r8" (UID: "29645320-2941-4f2c-9898-10778bf22857") : references non-existent secret key: ca.crt Apr 21 15:39:56.499485 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.499444 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-cr6pg"] Apr 21 15:39:56.503411 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.503389 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-cr6pg" Apr 21 15:39:56.507708 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.507681 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 21 15:39:56.517664 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.517643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k88t\" (UniqueName: \"kubernetes.io/projected/7467aa68-547b-46f8-b5ed-7989bae62b8d-kube-api-access-5k88t\") pod \"keda-admission-cf49989db-cr6pg\" (UID: \"7467aa68-547b-46f8-b5ed-7989bae62b8d\") " pod="openshift-keda/keda-admission-cf49989db-cr6pg" Apr 21 15:39:56.517745 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.517692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7467aa68-547b-46f8-b5ed-7989bae62b8d-certificates\") pod \"keda-admission-cf49989db-cr6pg\" (UID: \"7467aa68-547b-46f8-b5ed-7989bae62b8d\") " pod="openshift-keda/keda-admission-cf49989db-cr6pg" Apr 21 15:39:56.528580 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.528555 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-cr6pg"] Apr 21 15:39:56.618512 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.618455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k88t\" (UniqueName: \"kubernetes.io/projected/7467aa68-547b-46f8-b5ed-7989bae62b8d-kube-api-access-5k88t\") pod \"keda-admission-cf49989db-cr6pg\" (UID: \"7467aa68-547b-46f8-b5ed-7989bae62b8d\") " pod="openshift-keda/keda-admission-cf49989db-cr6pg" Apr 21 15:39:56.618695 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.618573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/7467aa68-547b-46f8-b5ed-7989bae62b8d-certificates\") pod \"keda-admission-cf49989db-cr6pg\" (UID: \"7467aa68-547b-46f8-b5ed-7989bae62b8d\") " pod="openshift-keda/keda-admission-cf49989db-cr6pg" Apr 21 15:39:56.622074 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.622049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7467aa68-547b-46f8-b5ed-7989bae62b8d-certificates\") pod \"keda-admission-cf49989db-cr6pg\" (UID: \"7467aa68-547b-46f8-b5ed-7989bae62b8d\") " pod="openshift-keda/keda-admission-cf49989db-cr6pg" Apr 21 15:39:56.638452 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.638426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k88t\" (UniqueName: \"kubernetes.io/projected/7467aa68-547b-46f8-b5ed-7989bae62b8d-kube-api-access-5k88t\") pod \"keda-admission-cf49989db-cr6pg\" (UID: \"7467aa68-547b-46f8-b5ed-7989bae62b8d\") " pod="openshift-keda/keda-admission-cf49989db-cr6pg" Apr 21 15:39:56.813999 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.813963 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-cr6pg" Apr 21 15:39:56.820214 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.820128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" Apr 21 15:39:56.820350 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.820244 2576 secret.go:281] references non-existent secret key: tls.crt Apr 21 15:39:56.820350 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.820264 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 21 15:39:56.820350 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.820290 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt: references non-existent secret key: tls.crt Apr 21 15:39:56.820529 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:56.820354 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates podName:6b259eca-069b-497d-8860-a863b6bca0d8 nodeName:}" failed. No retries permitted until 2026-04-21 15:39:57.820334514 +0000 UTC m=+286.538411857 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates") pod "keda-metrics-apiserver-7c9f485588-ss9jt" (UID: "6b259eca-069b-497d-8860-a863b6bca0d8") : references non-existent secret key: tls.crt Apr 21 15:39:56.964464 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:56.964429 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-cr6pg"] Apr 21 15:39:56.969762 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:39:56.969719 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7467aa68_547b_46f8_b5ed_7989bae62b8d.slice/crio-ba1ee4f4eaea963d5a471a4d8062d1a48fb8552d36dbf8e3b7918563dd866c8a WatchSource:0}: Error finding container ba1ee4f4eaea963d5a471a4d8062d1a48fb8552d36dbf8e3b7918563dd866c8a: Status 404 returned error can't find the container with id ba1ee4f4eaea963d5a471a4d8062d1a48fb8552d36dbf8e3b7918563dd866c8a Apr 21 15:39:57.426183 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:57.426144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8" Apr 21 15:39:57.426389 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:57.426280 2576 secret.go:281] references non-existent secret key: ca.crt Apr 21 15:39:57.426389 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:57.426296 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 21 15:39:57.426389 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:57.426305 2576 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-operator-ffbb595cb-wr4r8: references non-existent secret key: ca.crt Apr 21 15:39:57.426389 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:57.426357 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates podName:29645320-2941-4f2c-9898-10778bf22857 nodeName:}" failed. No retries permitted until 2026-04-21 15:39:59.426342976 +0000 UTC m=+288.144420302 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates") pod "keda-operator-ffbb595cb-wr4r8" (UID: "29645320-2941-4f2c-9898-10778bf22857") : references non-existent secret key: ca.crt Apr 21 15:39:57.829323 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:57.829283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" Apr 21 15:39:57.829770 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:57.829371 2576 secret.go:281] references non-existent secret key: tls.crt Apr 21 15:39:57.829770 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:57.829394 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 21 15:39:57.829770 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:57.829416 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt: references non-existent secret key: tls.crt Apr 21 15:39:57.829770 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:57.829481 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates podName:6b259eca-069b-497d-8860-a863b6bca0d8 nodeName:}" failed. No retries permitted until 2026-04-21 15:39:59.829458495 +0000 UTC m=+288.547535823 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates") pod "keda-metrics-apiserver-7c9f485588-ss9jt" (UID: "6b259eca-069b-497d-8860-a863b6bca0d8") : references non-existent secret key: tls.crt Apr 21 15:39:57.955864 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:57.955827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-cr6pg" event={"ID":"7467aa68-547b-46f8-b5ed-7989bae62b8d","Type":"ContainerStarted","Data":"ba1ee4f4eaea963d5a471a4d8062d1a48fb8552d36dbf8e3b7918563dd866c8a"} Apr 21 15:39:59.441351 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:59.441309 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8" Apr 21 15:39:59.441747 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:59.441463 2576 secret.go:281] references non-existent secret key: ca.crt Apr 21 15:39:59.441747 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:59.441480 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 21 15:39:59.441747 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:59.441513 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-wr4r8: references non-existent secret key: ca.crt Apr 21 15:39:59.441747 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:39:59.441579 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates podName:29645320-2941-4f2c-9898-10778bf22857 nodeName:}" failed. No retries permitted until 2026-04-21 15:40:03.441561142 +0000 UTC m=+292.159638468 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates") pod "keda-operator-ffbb595cb-wr4r8" (UID: "29645320-2941-4f2c-9898-10778bf22857") : references non-existent secret key: ca.crt Apr 21 15:39:59.845183 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:59.845135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" Apr 21 15:39:59.848151 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:39:59.848121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6b259eca-069b-497d-8860-a863b6bca0d8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ss9jt\" (UID: \"6b259eca-069b-497d-8860-a863b6bca0d8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" Apr 21 15:40:00.078877 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:00.078837 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" Apr 21 15:40:00.220295 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:00.220264 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt"] Apr 21 15:40:00.447681 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:40:00.447608 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b259eca_069b_497d_8860_a863b6bca0d8.slice/crio-0375f0ed7ba45e0d1dce4eeb9d25d8809009ce81cb8e98ff829d02b6e0c787a3 WatchSource:0}: Error finding container 0375f0ed7ba45e0d1dce4eeb9d25d8809009ce81cb8e98ff829d02b6e0c787a3: Status 404 returned error can't find the container with id 0375f0ed7ba45e0d1dce4eeb9d25d8809009ce81cb8e98ff829d02b6e0c787a3 Apr 21 15:40:00.967892 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:00.967856 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-cr6pg" event={"ID":"7467aa68-547b-46f8-b5ed-7989bae62b8d","Type":"ContainerStarted","Data":"98a0ac1a26e062cd2abee78e1c9fc5070d09e574c356f324ca4c0c1c18d065d7"} Apr 21 15:40:00.968094 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:00.968001 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-cr6pg" Apr 21 15:40:00.968916 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:00.968893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" event={"ID":"6b259eca-069b-497d-8860-a863b6bca0d8","Type":"ContainerStarted","Data":"0375f0ed7ba45e0d1dce4eeb9d25d8809009ce81cb8e98ff829d02b6e0c787a3"} Apr 21 15:40:00.987821 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:00.987774 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-cr6pg" podStartSLOduration=1.449813285 
podStartE2EDuration="4.987759772s" podCreationTimestamp="2026-04-21 15:39:56 +0000 UTC" firstStartedPulling="2026-04-21 15:39:56.972644534 +0000 UTC m=+285.690721864" lastFinishedPulling="2026-04-21 15:40:00.510591014 +0000 UTC m=+289.228668351" observedRunningTime="2026-04-21 15:40:00.986927997 +0000 UTC m=+289.705005342" watchObservedRunningTime="2026-04-21 15:40:00.987759772 +0000 UTC m=+289.705837119" Apr 21 15:40:03.478675 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:03.478572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8" Apr 21 15:40:03.481026 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:03.481003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29645320-2941-4f2c-9898-10778bf22857-certificates\") pod \"keda-operator-ffbb595cb-wr4r8\" (UID: \"29645320-2941-4f2c-9898-10778bf22857\") " pod="openshift-keda/keda-operator-ffbb595cb-wr4r8" Apr 21 15:40:03.549352 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:03.549300 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-wr4r8" Apr 21 15:40:03.682894 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:03.682858 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-wr4r8"] Apr 21 15:40:03.686318 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:40:03.686285 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29645320_2941_4f2c_9898_10778bf22857.slice/crio-e2ce775a665f17adcdc042d3c2b1149697ad3b60080969a48755684f68f2a476 WatchSource:0}: Error finding container e2ce775a665f17adcdc042d3c2b1149697ad3b60080969a48755684f68f2a476: Status 404 returned error can't find the container with id e2ce775a665f17adcdc042d3c2b1149697ad3b60080969a48755684f68f2a476 Apr 21 15:40:03.981301 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:03.981261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" event={"ID":"6b259eca-069b-497d-8860-a863b6bca0d8","Type":"ContainerStarted","Data":"b37ee46b0a94b8bd4944a338d966aa66f851b25f9e6543889432a8ff2d2a310d"} Apr 21 15:40:03.981527 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:03.981390 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" Apr 21 15:40:03.982313 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:03.982287 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-wr4r8" event={"ID":"29645320-2941-4f2c-9898-10778bf22857","Type":"ContainerStarted","Data":"e2ce775a665f17adcdc042d3c2b1149697ad3b60080969a48755684f68f2a476"} Apr 21 15:40:04.003066 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:04.003016 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" podStartSLOduration=5.300509111 
podStartE2EDuration="8.002998583s" podCreationTimestamp="2026-04-21 15:39:56 +0000 UTC" firstStartedPulling="2026-04-21 15:40:00.449178111 +0000 UTC m=+289.167255451" lastFinishedPulling="2026-04-21 15:40:03.151667597 +0000 UTC m=+291.869744923" observedRunningTime="2026-04-21 15:40:04.001257574 +0000 UTC m=+292.719334942" watchObservedRunningTime="2026-04-21 15:40:04.002998583 +0000 UTC m=+292.721075931" Apr 21 15:40:07.995948 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:07.995914 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-wr4r8" event={"ID":"29645320-2941-4f2c-9898-10778bf22857","Type":"ContainerStarted","Data":"6e3b986359d7ba2c03aea9b46895d754afbb2bb194e1203f80abeae759bb1d65"} Apr 21 15:40:07.996517 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:07.996041 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-wr4r8" Apr 21 15:40:08.025070 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:08.025014 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-wr4r8" podStartSLOduration=9.56211224 podStartE2EDuration="13.024998021s" podCreationTimestamp="2026-04-21 15:39:55 +0000 UTC" firstStartedPulling="2026-04-21 15:40:03.687792114 +0000 UTC m=+292.405869440" lastFinishedPulling="2026-04-21 15:40:07.150677892 +0000 UTC m=+295.868755221" observedRunningTime="2026-04-21 15:40:08.022312468 +0000 UTC m=+296.740389815" watchObservedRunningTime="2026-04-21 15:40:08.024998021 +0000 UTC m=+296.743075368" Apr 21 15:40:11.726018 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:11.725990 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log" Apr 21 15:40:11.726456 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:11.726046 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log" Apr 21 15:40:11.738134 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:11.738101 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 15:40:14.990270 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:14.990236 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ss9jt" Apr 21 15:40:16.954160 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:16.954121 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5njx6" Apr 21 15:40:21.975442 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:21.975411 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-cr6pg" Apr 21 15:40:29.001930 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:40:29.001898 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-wr4r8" Apr 21 15:41:12.604852 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.604765 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-z5hwj"] Apr 21 15:41:12.607352 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.607330 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 15:41:12.610300 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.610277 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 21 15:41:12.611369 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.611348 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 21 15:41:12.611472 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.611369 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-s4rzm\"" Apr 21 15:41:12.611472 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.611358 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 21 15:41:12.619204 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.619181 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-z5hwj"] Apr 21 15:41:12.641106 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.641074 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-ngqsk"] Apr 21 15:41:12.643811 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.643794 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-ngqsk" Apr 21 15:41:12.646877 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.646852 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 21 15:41:12.646997 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.646852 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-7hhth\"" Apr 21 15:41:12.653386 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.653360 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-ngqsk"] Apr 21 15:41:12.682981 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.682944 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/761953ed-095c-4ca4-bf9b-32602f9feeef-cert\") pod \"kserve-controller-manager-9c85dd4d8-z5hwj\" (UID: \"761953ed-095c-4ca4-bf9b-32602f9feeef\") " pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 15:41:12.683193 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.683090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc2f7\" (UniqueName: \"kubernetes.io/projected/761953ed-095c-4ca4-bf9b-32602f9feeef-kube-api-access-jc2f7\") pod \"kserve-controller-manager-9c85dd4d8-z5hwj\" (UID: \"761953ed-095c-4ca4-bf9b-32602f9feeef\") " pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 15:41:12.783582 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.783538 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jc2f7\" (UniqueName: \"kubernetes.io/projected/761953ed-095c-4ca4-bf9b-32602f9feeef-kube-api-access-jc2f7\") pod \"kserve-controller-manager-9c85dd4d8-z5hwj\" (UID: \"761953ed-095c-4ca4-bf9b-32602f9feeef\") " pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 
15:41:12.783582 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.783583 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/70dc397b-75b1-4c2c-9810-89227545be76-data\") pod \"seaweedfs-86cc847c5c-ngqsk\" (UID: \"70dc397b-75b1-4c2c-9810-89227545be76\") " pod="kserve/seaweedfs-86cc847c5c-ngqsk" Apr 21 15:41:12.783824 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.783623 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/761953ed-095c-4ca4-bf9b-32602f9feeef-cert\") pod \"kserve-controller-manager-9c85dd4d8-z5hwj\" (UID: \"761953ed-095c-4ca4-bf9b-32602f9feeef\") " pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 15:41:12.783824 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.783657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwmcb\" (UniqueName: \"kubernetes.io/projected/70dc397b-75b1-4c2c-9810-89227545be76-kube-api-access-cwmcb\") pod \"seaweedfs-86cc847c5c-ngqsk\" (UID: \"70dc397b-75b1-4c2c-9810-89227545be76\") " pod="kserve/seaweedfs-86cc847c5c-ngqsk" Apr 21 15:41:12.786163 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.786136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/761953ed-095c-4ca4-bf9b-32602f9feeef-cert\") pod \"kserve-controller-manager-9c85dd4d8-z5hwj\" (UID: \"761953ed-095c-4ca4-bf9b-32602f9feeef\") " pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 15:41:12.796226 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.796195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc2f7\" (UniqueName: \"kubernetes.io/projected/761953ed-095c-4ca4-bf9b-32602f9feeef-kube-api-access-jc2f7\") pod \"kserve-controller-manager-9c85dd4d8-z5hwj\" (UID: 
\"761953ed-095c-4ca4-bf9b-32602f9feeef\") " pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 15:41:12.884713 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.884612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwmcb\" (UniqueName: \"kubernetes.io/projected/70dc397b-75b1-4c2c-9810-89227545be76-kube-api-access-cwmcb\") pod \"seaweedfs-86cc847c5c-ngqsk\" (UID: \"70dc397b-75b1-4c2c-9810-89227545be76\") " pod="kserve/seaweedfs-86cc847c5c-ngqsk" Apr 21 15:41:12.884713 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.884695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/70dc397b-75b1-4c2c-9810-89227545be76-data\") pod \"seaweedfs-86cc847c5c-ngqsk\" (UID: \"70dc397b-75b1-4c2c-9810-89227545be76\") " pod="kserve/seaweedfs-86cc847c5c-ngqsk" Apr 21 15:41:12.885034 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.885017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/70dc397b-75b1-4c2c-9810-89227545be76-data\") pod \"seaweedfs-86cc847c5c-ngqsk\" (UID: \"70dc397b-75b1-4c2c-9810-89227545be76\") " pod="kserve/seaweedfs-86cc847c5c-ngqsk" Apr 21 15:41:12.895961 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.895930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwmcb\" (UniqueName: \"kubernetes.io/projected/70dc397b-75b1-4c2c-9810-89227545be76-kube-api-access-cwmcb\") pod \"seaweedfs-86cc847c5c-ngqsk\" (UID: \"70dc397b-75b1-4c2c-9810-89227545be76\") " pod="kserve/seaweedfs-86cc847c5c-ngqsk" Apr 21 15:41:12.918828 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.918783 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 15:41:12.954216 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:12.954183 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-ngqsk" Apr 21 15:41:13.053834 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:13.053796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-z5hwj"] Apr 21 15:41:13.057128 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:41:13.057097 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod761953ed_095c_4ca4_bf9b_32602f9feeef.slice/crio-01265d3471fa8f2e3d64e91998309fb29c9180681aa2fe3956334a50592a6652 WatchSource:0}: Error finding container 01265d3471fa8f2e3d64e91998309fb29c9180681aa2fe3956334a50592a6652: Status 404 returned error can't find the container with id 01265d3471fa8f2e3d64e91998309fb29c9180681aa2fe3956334a50592a6652 Apr 21 15:41:13.058923 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:13.058902 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:41:13.101152 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:13.101066 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-ngqsk"] Apr 21 15:41:13.103346 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:41:13.103314 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70dc397b_75b1_4c2c_9810_89227545be76.slice/crio-49b70b19d52bea47a2438222c4ebc2095b217f414d3e91fb7e58e1db5d7ea0cb WatchSource:0}: Error finding container 49b70b19d52bea47a2438222c4ebc2095b217f414d3e91fb7e58e1db5d7ea0cb: Status 404 returned error can't find the container with id 49b70b19d52bea47a2438222c4ebc2095b217f414d3e91fb7e58e1db5d7ea0cb Apr 21 15:41:13.211349 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:41:13.211264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-ngqsk" event={"ID":"70dc397b-75b1-4c2c-9810-89227545be76","Type":"ContainerStarted","Data":"49b70b19d52bea47a2438222c4ebc2095b217f414d3e91fb7e58e1db5d7ea0cb"} Apr 21 15:41:13.212375 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:13.212348 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" event={"ID":"761953ed-095c-4ca4-bf9b-32602f9feeef","Type":"ContainerStarted","Data":"01265d3471fa8f2e3d64e91998309fb29c9180681aa2fe3956334a50592a6652"} Apr 21 15:41:17.232508 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:17.232456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-ngqsk" event={"ID":"70dc397b-75b1-4c2c-9810-89227545be76","Type":"ContainerStarted","Data":"ea2963f79159b2fc8df285feceac0afdb4a27ad7d53e4204b6e71123580faf65"} Apr 21 15:41:17.232970 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:17.232524 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-ngqsk" Apr 21 15:41:17.233790 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:17.233765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" event={"ID":"761953ed-095c-4ca4-bf9b-32602f9feeef","Type":"ContainerStarted","Data":"f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561"} Apr 21 15:41:17.233925 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:17.233871 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 15:41:17.257787 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:17.257739 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-ngqsk" podStartSLOduration=1.7104297960000001 podStartE2EDuration="5.257727202s" 
podCreationTimestamp="2026-04-21 15:41:12 +0000 UTC" firstStartedPulling="2026-04-21 15:41:13.104621812 +0000 UTC m=+361.822699139" lastFinishedPulling="2026-04-21 15:41:16.651919203 +0000 UTC m=+365.369996545" observedRunningTime="2026-04-21 15:41:17.256902234 +0000 UTC m=+365.974979595" watchObservedRunningTime="2026-04-21 15:41:17.257727202 +0000 UTC m=+365.975804550" Apr 21 15:41:17.291172 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:17.291119 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" podStartSLOduration=1.7689984619999999 podStartE2EDuration="5.291103711s" podCreationTimestamp="2026-04-21 15:41:12 +0000 UTC" firstStartedPulling="2026-04-21 15:41:13.05906904 +0000 UTC m=+361.777146369" lastFinishedPulling="2026-04-21 15:41:16.581174287 +0000 UTC m=+365.299251618" observedRunningTime="2026-04-21 15:41:17.289945143 +0000 UTC m=+366.008022502" watchObservedRunningTime="2026-04-21 15:41:17.291103711 +0000 UTC m=+366.009181058" Apr 21 15:41:23.239276 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:23.239242 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-ngqsk" Apr 21 15:41:47.371972 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.371936 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56bc4fbb44-m7gsp"] Apr 21 15:41:47.375665 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.375633 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.400665 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.400628 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56bc4fbb44-m7gsp"] Apr 21 15:41:47.490810 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.490770 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-trusted-ca-bundle\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.491013 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.490839 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ebb80ef-7217-42d0-9955-a96caaee251b-console-oauth-config\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.491013 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.490903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ebb80ef-7217-42d0-9955-a96caaee251b-console-serving-cert\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.491013 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.490921 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-oauth-serving-cert\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " 
pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.491013 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.490940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-console-config\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.491221 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.491016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-service-ca\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.491221 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.491062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk5qt\" (UniqueName: \"kubernetes.io/projected/1ebb80ef-7217-42d0-9955-a96caaee251b-kube-api-access-jk5qt\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.592319 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.592279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ebb80ef-7217-42d0-9955-a96caaee251b-console-serving-cert\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.592319 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.592317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-oauth-serving-cert\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.592646 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.592335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-console-config\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.592646 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.592400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-service-ca\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.592646 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.592443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jk5qt\" (UniqueName: \"kubernetes.io/projected/1ebb80ef-7217-42d0-9955-a96caaee251b-kube-api-access-jk5qt\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.592646 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.592533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-trusted-ca-bundle\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.592646 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.592630 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ebb80ef-7217-42d0-9955-a96caaee251b-console-oauth-config\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.593150 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.593126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-console-config\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.593275 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.593159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-service-ca\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.593371 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.593350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-oauth-serving-cert\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.593521 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.593479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ebb80ef-7217-42d0-9955-a96caaee251b-trusted-ca-bundle\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.594975 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.594953 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ebb80ef-7217-42d0-9955-a96caaee251b-console-oauth-config\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.595129 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.595111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ebb80ef-7217-42d0-9955-a96caaee251b-console-serving-cert\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.604083 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.604054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk5qt\" (UniqueName: \"kubernetes.io/projected/1ebb80ef-7217-42d0-9955-a96caaee251b-kube-api-access-jk5qt\") pod \"console-56bc4fbb44-m7gsp\" (UID: \"1ebb80ef-7217-42d0-9955-a96caaee251b\") " pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.685102 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.685009 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:47.819551 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:47.819525 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56bc4fbb44-m7gsp"] Apr 21 15:41:47.822208 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:41:47.822178 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ebb80ef_7217_42d0_9955_a96caaee251b.slice/crio-fa2625b654cfe250aecc47ae6c4c071c3572a8eeb1b31279b68ffc20fbe9387d WatchSource:0}: Error finding container fa2625b654cfe250aecc47ae6c4c071c3572a8eeb1b31279b68ffc20fbe9387d: Status 404 returned error can't find the container with id fa2625b654cfe250aecc47ae6c4c071c3572a8eeb1b31279b68ffc20fbe9387d Apr 21 15:41:48.242356 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:48.242326 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 15:41:48.341086 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:48.341050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56bc4fbb44-m7gsp" event={"ID":"1ebb80ef-7217-42d0-9955-a96caaee251b","Type":"ContainerStarted","Data":"76640f8cdd18fb6b5d27c6344ad2c6f3b3c93e7f3c1bc974eb5ca39ae5934f2e"} Apr 21 15:41:48.341086 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:48.341091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56bc4fbb44-m7gsp" event={"ID":"1ebb80ef-7217-42d0-9955-a96caaee251b","Type":"ContainerStarted","Data":"fa2625b654cfe250aecc47ae6c4c071c3572a8eeb1b31279b68ffc20fbe9387d"} Apr 21 15:41:48.385635 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:48.385581 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56bc4fbb44-m7gsp" podStartSLOduration=1.385562628 podStartE2EDuration="1.385562628s" 
podCreationTimestamp="2026-04-21 15:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:41:48.38378395 +0000 UTC m=+397.101861297" watchObservedRunningTime="2026-04-21 15:41:48.385562628 +0000 UTC m=+397.103639976" Apr 21 15:41:48.974512 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:48.974413 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-z5hwj"] Apr 21 15:41:48.974683 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:48.974657 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" podUID="761953ed-095c-4ca4-bf9b-32602f9feeef" containerName="manager" containerID="cri-o://f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561" gracePeriod=10 Apr 21 15:41:49.218310 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.218284 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 15:41:49.308296 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.308268 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/761953ed-095c-4ca4-bf9b-32602f9feeef-cert\") pod \"761953ed-095c-4ca4-bf9b-32602f9feeef\" (UID: \"761953ed-095c-4ca4-bf9b-32602f9feeef\") " Apr 21 15:41:49.308441 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.308363 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc2f7\" (UniqueName: \"kubernetes.io/projected/761953ed-095c-4ca4-bf9b-32602f9feeef-kube-api-access-jc2f7\") pod \"761953ed-095c-4ca4-bf9b-32602f9feeef\" (UID: \"761953ed-095c-4ca4-bf9b-32602f9feeef\") " Apr 21 15:41:49.310481 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.310449 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761953ed-095c-4ca4-bf9b-32602f9feeef-cert" (OuterVolumeSpecName: "cert") pod "761953ed-095c-4ca4-bf9b-32602f9feeef" (UID: "761953ed-095c-4ca4-bf9b-32602f9feeef"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:41:49.310605 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.310509 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761953ed-095c-4ca4-bf9b-32602f9feeef-kube-api-access-jc2f7" (OuterVolumeSpecName: "kube-api-access-jc2f7") pod "761953ed-095c-4ca4-bf9b-32602f9feeef" (UID: "761953ed-095c-4ca4-bf9b-32602f9feeef"). InnerVolumeSpecName "kube-api-access-jc2f7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:41:49.345977 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.345944 2576 generic.go:358] "Generic (PLEG): container finished" podID="761953ed-095c-4ca4-bf9b-32602f9feeef" containerID="f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561" exitCode=0 Apr 21 15:41:49.346138 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.346013 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" Apr 21 15:41:49.346138 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.346030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" event={"ID":"761953ed-095c-4ca4-bf9b-32602f9feeef","Type":"ContainerDied","Data":"f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561"} Apr 21 15:41:49.346138 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.346067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-z5hwj" event={"ID":"761953ed-095c-4ca4-bf9b-32602f9feeef","Type":"ContainerDied","Data":"01265d3471fa8f2e3d64e91998309fb29c9180681aa2fe3956334a50592a6652"} Apr 21 15:41:49.346138 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.346083 2576 scope.go:117] "RemoveContainer" containerID="f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561" Apr 21 15:41:49.355145 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.355126 2576 scope.go:117] "RemoveContainer" containerID="f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561" Apr 21 15:41:49.355440 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:41:49.355419 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561\": container with ID starting with 
f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561 not found: ID does not exist" containerID="f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561" Apr 21 15:41:49.355540 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.355448 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561"} err="failed to get container status \"f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561\": rpc error: code = NotFound desc = could not find container \"f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561\": container with ID starting with f5d343d57318079d24b05a4aedaa1741c649745f3be1abc64cf11f21d1937561 not found: ID does not exist" Apr 21 15:41:49.371607 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.371579 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-z5hwj"] Apr 21 15:41:49.381411 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.381387 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-z5hwj"] Apr 21 15:41:49.410369 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.410182 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jc2f7\" (UniqueName: \"kubernetes.io/projected/761953ed-095c-4ca4-bf9b-32602f9feeef-kube-api-access-jc2f7\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:41:49.410369 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.410213 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/761953ed-095c-4ca4-bf9b-32602f9feeef-cert\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:41:49.802231 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:49.802195 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761953ed-095c-4ca4-bf9b-32602f9feeef" 
path="/var/lib/kubelet/pods/761953ed-095c-4ca4-bf9b-32602f9feeef/volumes" Apr 21 15:41:57.685983 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:57.685945 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:57.686572 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:57.686086 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:57.691177 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:57.691154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:58.381535 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:58.381505 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56bc4fbb44-m7gsp" Apr 21 15:41:58.471337 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:41:58.471299 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7dd9967dc7-r8jjk"] Apr 21 15:42:23.490791 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.490674 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7dd9967dc7-r8jjk" podUID="8fef7932-6121-413e-bc03-e56177bd66ff" containerName="console" containerID="cri-o://a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21" gracePeriod=15 Apr 21 15:42:23.727084 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.727059 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7dd9967dc7-r8jjk_8fef7932-6121-413e-bc03-e56177bd66ff/console/0.log" Apr 21 15:42:23.727229 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.727125 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:42:23.806534 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.806505 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-oauth-config\") pod \"8fef7932-6121-413e-bc03-e56177bd66ff\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " Apr 21 15:42:23.806699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.806616 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-serving-cert\") pod \"8fef7932-6121-413e-bc03-e56177bd66ff\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " Apr 21 15:42:23.806699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.806655 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-trusted-ca-bundle\") pod \"8fef7932-6121-413e-bc03-e56177bd66ff\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " Apr 21 15:42:23.806699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.806679 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-service-ca\") pod \"8fef7932-6121-413e-bc03-e56177bd66ff\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " Apr 21 15:42:23.806926 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.806707 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-oauth-serving-cert\") pod \"8fef7932-6121-413e-bc03-e56177bd66ff\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " Apr 21 15:42:23.806926 
ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.806738 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78xmt\" (UniqueName: \"kubernetes.io/projected/8fef7932-6121-413e-bc03-e56177bd66ff-kube-api-access-78xmt\") pod \"8fef7932-6121-413e-bc03-e56177bd66ff\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " Apr 21 15:42:23.806926 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.806773 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-console-config\") pod \"8fef7932-6121-413e-bc03-e56177bd66ff\" (UID: \"8fef7932-6121-413e-bc03-e56177bd66ff\") " Apr 21 15:42:23.807097 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.807056 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-service-ca" (OuterVolumeSpecName: "service-ca") pod "8fef7932-6121-413e-bc03-e56177bd66ff" (UID: "8fef7932-6121-413e-bc03-e56177bd66ff"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:42:23.807097 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.807083 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8fef7932-6121-413e-bc03-e56177bd66ff" (UID: "8fef7932-6121-413e-bc03-e56177bd66ff"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:42:23.807452 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.807427 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-console-config" (OuterVolumeSpecName: "console-config") pod "8fef7932-6121-413e-bc03-e56177bd66ff" (UID: "8fef7932-6121-413e-bc03-e56177bd66ff"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:42:23.807591 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.807559 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8fef7932-6121-413e-bc03-e56177bd66ff" (UID: "8fef7932-6121-413e-bc03-e56177bd66ff"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:42:23.808806 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.808774 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8fef7932-6121-413e-bc03-e56177bd66ff" (UID: "8fef7932-6121-413e-bc03-e56177bd66ff"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:42:23.808901 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.808836 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8fef7932-6121-413e-bc03-e56177bd66ff" (UID: "8fef7932-6121-413e-bc03-e56177bd66ff"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:42:23.809094 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.809065 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fef7932-6121-413e-bc03-e56177bd66ff-kube-api-access-78xmt" (OuterVolumeSpecName: "kube-api-access-78xmt") pod "8fef7932-6121-413e-bc03-e56177bd66ff" (UID: "8fef7932-6121-413e-bc03-e56177bd66ff"). InnerVolumeSpecName "kube-api-access-78xmt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:42:23.907720 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.907674 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-oauth-config\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:42:23.907720 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.907712 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fef7932-6121-413e-bc03-e56177bd66ff-console-serving-cert\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:42:23.907720 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.907726 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-trusted-ca-bundle\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:42:23.907969 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.907739 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-service-ca\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:42:23.907969 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.907752 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-oauth-serving-cert\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:42:23.907969 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.907764 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-78xmt\" (UniqueName: \"kubernetes.io/projected/8fef7932-6121-413e-bc03-e56177bd66ff-kube-api-access-78xmt\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:42:23.907969 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:23.907777 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fef7932-6121-413e-bc03-e56177bd66ff-console-config\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:42:24.472345 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.472318 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7dd9967dc7-r8jjk_8fef7932-6121-413e-bc03-e56177bd66ff/console/0.log" Apr 21 15:42:24.472538 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.472358 2576 generic.go:358] "Generic (PLEG): container finished" podID="8fef7932-6121-413e-bc03-e56177bd66ff" containerID="a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21" exitCode=2 Apr 21 15:42:24.472538 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.472393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7dd9967dc7-r8jjk" event={"ID":"8fef7932-6121-413e-bc03-e56177bd66ff","Type":"ContainerDied","Data":"a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21"} Apr 21 15:42:24.472538 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.472433 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7dd9967dc7-r8jjk" Apr 21 15:42:24.472538 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.472440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7dd9967dc7-r8jjk" event={"ID":"8fef7932-6121-413e-bc03-e56177bd66ff","Type":"ContainerDied","Data":"d852652ebb00b2ec60d528fc38c43f8c975c3039f39b3d56ac23c64c2e79ef37"} Apr 21 15:42:24.472538 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.472460 2576 scope.go:117] "RemoveContainer" containerID="a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21" Apr 21 15:42:24.481224 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.481205 2576 scope.go:117] "RemoveContainer" containerID="a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21" Apr 21 15:42:24.481472 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:42:24.481452 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21\": container with ID starting with a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21 not found: ID does not exist" containerID="a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21" Apr 21 15:42:24.481569 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.481542 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21"} err="failed to get container status \"a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21\": rpc error: code = NotFound desc = could not find container \"a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21\": container with ID starting with a12fd36cc79272451009787a3e4f593a8d7cd84d89d1f403cce9367c124a4b21 not found: ID does not exist" Apr 21 15:42:24.507761 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.507731 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7dd9967dc7-r8jjk"] Apr 21 15:42:24.520152 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.520123 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7dd9967dc7-r8jjk"] Apr 21 15:42:24.908821 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.908779 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-675zd"] Apr 21 15:42:24.909581 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.909557 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="761953ed-095c-4ca4-bf9b-32602f9feeef" containerName="manager" Apr 21 15:42:24.909581 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.909580 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="761953ed-095c-4ca4-bf9b-32602f9feeef" containerName="manager" Apr 21 15:42:24.909770 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.909609 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8fef7932-6121-413e-bc03-e56177bd66ff" containerName="console" Apr 21 15:42:24.909770 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.909617 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fef7932-6121-413e-bc03-e56177bd66ff" containerName="console" Apr 21 15:42:24.909861 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.909797 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="761953ed-095c-4ca4-bf9b-32602f9feeef" containerName="manager" Apr 21 15:42:24.909861 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.909825 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8fef7932-6121-413e-bc03-e56177bd66ff" containerName="console" Apr 21 15:42:24.914774 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.914747 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-zxk7h"] Apr 21 15:42:24.914925 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:42:24.914902 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-675zd" Apr 21 15:42:24.917846 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.917824 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-zxk7h" Apr 21 15:42:24.917846 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.917841 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-9rjjx\"" Apr 21 15:42:24.918732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.918717 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 21 15:42:24.920514 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.920469 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-2dtsh\"" Apr 21 15:42:24.920664 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.920641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 21 15:42:24.927465 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.927085 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-675zd"] Apr 21 15:42:24.928898 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:24.928877 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-zxk7h"] Apr 21 15:42:25.015776 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.015739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wtrl\" (UniqueName: \"kubernetes.io/projected/c75281e7-f4b2-4753-91b0-031da738e963-kube-api-access-6wtrl\") pod \"odh-model-controller-696fc77849-zxk7h\" (UID: \"c75281e7-f4b2-4753-91b0-031da738e963\") " 
pod="kserve/odh-model-controller-696fc77849-zxk7h" Apr 21 15:42:25.015776 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.015784 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdsq5\" (UniqueName: \"kubernetes.io/projected/7e967817-d37c-4c44-971c-3512adb8603d-kube-api-access-cdsq5\") pod \"model-serving-api-86f7b4b499-675zd\" (UID: \"7e967817-d37c-4c44-971c-3512adb8603d\") " pod="kserve/model-serving-api-86f7b4b499-675zd" Apr 21 15:42:25.015993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.015843 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7e967817-d37c-4c44-971c-3512adb8603d-tls-certs\") pod \"model-serving-api-86f7b4b499-675zd\" (UID: \"7e967817-d37c-4c44-971c-3512adb8603d\") " pod="kserve/model-serving-api-86f7b4b499-675zd" Apr 21 15:42:25.015993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.015910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c75281e7-f4b2-4753-91b0-031da738e963-cert\") pod \"odh-model-controller-696fc77849-zxk7h\" (UID: \"c75281e7-f4b2-4753-91b0-031da738e963\") " pod="kserve/odh-model-controller-696fc77849-zxk7h" Apr 21 15:42:25.116524 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.116471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7e967817-d37c-4c44-971c-3512adb8603d-tls-certs\") pod \"model-serving-api-86f7b4b499-675zd\" (UID: \"7e967817-d37c-4c44-971c-3512adb8603d\") " pod="kserve/model-serving-api-86f7b4b499-675zd" Apr 21 15:42:25.116688 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.116536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c75281e7-f4b2-4753-91b0-031da738e963-cert\") 
pod \"odh-model-controller-696fc77849-zxk7h\" (UID: \"c75281e7-f4b2-4753-91b0-031da738e963\") " pod="kserve/odh-model-controller-696fc77849-zxk7h" Apr 21 15:42:25.116688 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.116589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wtrl\" (UniqueName: \"kubernetes.io/projected/c75281e7-f4b2-4753-91b0-031da738e963-kube-api-access-6wtrl\") pod \"odh-model-controller-696fc77849-zxk7h\" (UID: \"c75281e7-f4b2-4753-91b0-031da738e963\") " pod="kserve/odh-model-controller-696fc77849-zxk7h" Apr 21 15:42:25.116688 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.116629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdsq5\" (UniqueName: \"kubernetes.io/projected/7e967817-d37c-4c44-971c-3512adb8603d-kube-api-access-cdsq5\") pod \"model-serving-api-86f7b4b499-675zd\" (UID: \"7e967817-d37c-4c44-971c-3512adb8603d\") " pod="kserve/model-serving-api-86f7b4b499-675zd" Apr 21 15:42:25.118821 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.118796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7e967817-d37c-4c44-971c-3512adb8603d-tls-certs\") pod \"model-serving-api-86f7b4b499-675zd\" (UID: \"7e967817-d37c-4c44-971c-3512adb8603d\") " pod="kserve/model-serving-api-86f7b4b499-675zd" Apr 21 15:42:25.118943 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.118846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c75281e7-f4b2-4753-91b0-031da738e963-cert\") pod \"odh-model-controller-696fc77849-zxk7h\" (UID: \"c75281e7-f4b2-4753-91b0-031da738e963\") " pod="kserve/odh-model-controller-696fc77849-zxk7h" Apr 21 15:42:25.128920 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.128897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wtrl\" (UniqueName: 
\"kubernetes.io/projected/c75281e7-f4b2-4753-91b0-031da738e963-kube-api-access-6wtrl\") pod \"odh-model-controller-696fc77849-zxk7h\" (UID: \"c75281e7-f4b2-4753-91b0-031da738e963\") " pod="kserve/odh-model-controller-696fc77849-zxk7h" Apr 21 15:42:25.130420 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.130403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdsq5\" (UniqueName: \"kubernetes.io/projected/7e967817-d37c-4c44-971c-3512adb8603d-kube-api-access-cdsq5\") pod \"model-serving-api-86f7b4b499-675zd\" (UID: \"7e967817-d37c-4c44-971c-3512adb8603d\") " pod="kserve/model-serving-api-86f7b4b499-675zd" Apr 21 15:42:25.228643 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.228558 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-675zd" Apr 21 15:42:25.235461 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.235426 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-zxk7h" Apr 21 15:42:25.402793 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.402700 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-zxk7h"] Apr 21 15:42:25.405447 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:42:25.405418 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc75281e7_f4b2_4753_91b0_031da738e963.slice/crio-e4f7f7289a5a0c4b09c04956f27d82cc4d31e7679ef58eb74bfead6aadec323b WatchSource:0}: Error finding container e4f7f7289a5a0c4b09c04956f27d82cc4d31e7679ef58eb74bfead6aadec323b: Status 404 returned error can't find the container with id e4f7f7289a5a0c4b09c04956f27d82cc4d31e7679ef58eb74bfead6aadec323b Apr 21 15:42:25.421819 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.421794 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve/model-serving-api-86f7b4b499-675zd"] Apr 21 15:42:25.424819 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:42:25.424790 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e967817_d37c_4c44_971c_3512adb8603d.slice/crio-5f6e882625aab04adc6bb1b18428a21d96c922c57016dca240e1a4f2bdee1c8d WatchSource:0}: Error finding container 5f6e882625aab04adc6bb1b18428a21d96c922c57016dca240e1a4f2bdee1c8d: Status 404 returned error can't find the container with id 5f6e882625aab04adc6bb1b18428a21d96c922c57016dca240e1a4f2bdee1c8d Apr 21 15:42:25.480473 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.480372 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-zxk7h" event={"ID":"c75281e7-f4b2-4753-91b0-031da738e963","Type":"ContainerStarted","Data":"e4f7f7289a5a0c4b09c04956f27d82cc4d31e7679ef58eb74bfead6aadec323b"} Apr 21 15:42:25.482144 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.482107 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-675zd" event={"ID":"7e967817-d37c-4c44-971c-3512adb8603d","Type":"ContainerStarted","Data":"5f6e882625aab04adc6bb1b18428a21d96c922c57016dca240e1a4f2bdee1c8d"} Apr 21 15:42:25.803721 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:25.803680 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fef7932-6121-413e-bc03-e56177bd66ff" path="/var/lib/kubelet/pods/8fef7932-6121-413e-bc03-e56177bd66ff/volumes" Apr 21 15:42:29.499255 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:29.499209 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-675zd" event={"ID":"7e967817-d37c-4c44-971c-3512adb8603d","Type":"ContainerStarted","Data":"3a2eb9c99a07ee773b1c3e36eaed7a2760c3e710183f4b03fef3154fe1278ea7"} Apr 21 15:42:29.499853 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:29.499316 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-675zd" Apr 21 15:42:29.500695 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:29.500670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-zxk7h" event={"ID":"c75281e7-f4b2-4753-91b0-031da738e963","Type":"ContainerStarted","Data":"4f52e81b0c237aee145a52816dff64858a4b18ce3ae21d2bd795238f88d4452f"} Apr 21 15:42:29.500792 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:29.500754 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-zxk7h" Apr 21 15:42:29.533716 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:29.533664 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-675zd" podStartSLOduration=2.022212015 podStartE2EDuration="5.533648892s" podCreationTimestamp="2026-04-21 15:42:24 +0000 UTC" firstStartedPulling="2026-04-21 15:42:25.426449643 +0000 UTC m=+434.144526968" lastFinishedPulling="2026-04-21 15:42:28.937886518 +0000 UTC m=+437.655963845" observedRunningTime="2026-04-21 15:42:29.532876882 +0000 UTC m=+438.250954231" watchObservedRunningTime="2026-04-21 15:42:29.533648892 +0000 UTC m=+438.251726239" Apr 21 15:42:29.580891 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:29.580837 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-zxk7h" podStartSLOduration=2.043780082 podStartE2EDuration="5.580820722s" podCreationTimestamp="2026-04-21 15:42:24 +0000 UTC" firstStartedPulling="2026-04-21 15:42:25.406661371 +0000 UTC m=+434.124738702" lastFinishedPulling="2026-04-21 15:42:28.943702016 +0000 UTC m=+437.661779342" observedRunningTime="2026-04-21 15:42:29.580522733 +0000 UTC m=+438.298600075" watchObservedRunningTime="2026-04-21 15:42:29.580820722 +0000 UTC m=+438.298898070" Apr 21 15:42:40.507386 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:42:40.507352 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-zxk7h" Apr 21 15:42:40.510151 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:42:40.510132 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-675zd" Apr 21 15:43:01.028219 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:01.028175 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj"] Apr 21 15:43:01.035301 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:01.035279 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:43:01.038959 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:01.038919 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xqtsr\"" Apr 21 15:43:01.042467 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:01.042440 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj"] Apr 21 15:43:01.141341 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:01.141303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88819024-ef88-4177-9ec6-3d7b2fe18065-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj\" (UID: \"88819024-ef88-4177-9ec6-3d7b2fe18065\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:43:01.242782 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:01.242739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/88819024-ef88-4177-9ec6-3d7b2fe18065-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj\" (UID: \"88819024-ef88-4177-9ec6-3d7b2fe18065\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:43:01.243136 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:01.243113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88819024-ef88-4177-9ec6-3d7b2fe18065-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj\" (UID: \"88819024-ef88-4177-9ec6-3d7b2fe18065\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:43:01.346260 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:01.346226 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:43:01.475918 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:01.475895 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj"] Apr 21 15:43:01.478205 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:43:01.478174 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88819024_ef88_4177_9ec6_3d7b2fe18065.slice/crio-526659364c26f3a8c6cb6b865e1808550959a42b109ef19f2dfa86597f1532d7 WatchSource:0}: Error finding container 526659364c26f3a8c6cb6b865e1808550959a42b109ef19f2dfa86597f1532d7: Status 404 returned error can't find the container with id 526659364c26f3a8c6cb6b865e1808550959a42b109ef19f2dfa86597f1532d7 Apr 21 15:43:01.617595 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:01.617509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" event={"ID":"88819024-ef88-4177-9ec6-3d7b2fe18065","Type":"ContainerStarted","Data":"526659364c26f3a8c6cb6b865e1808550959a42b109ef19f2dfa86597f1532d7"} Apr 21 15:43:06.637712 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:06.637671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" event={"ID":"88819024-ef88-4177-9ec6-3d7b2fe18065","Type":"ContainerStarted","Data":"75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386"} Apr 21 15:43:09.651093 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:09.650997 2576 generic.go:358] "Generic (PLEG): container finished" podID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerID="75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386" exitCode=0 Apr 21 15:43:09.651093 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:09.651069 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" event={"ID":"88819024-ef88-4177-9ec6-3d7b2fe18065","Type":"ContainerDied","Data":"75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386"} Apr 21 15:43:23.711114 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:23.711068 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" event={"ID":"88819024-ef88-4177-9ec6-3d7b2fe18065","Type":"ContainerStarted","Data":"2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a"} Apr 21 15:43:25.722977 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:25.722929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" event={"ID":"88819024-ef88-4177-9ec6-3d7b2fe18065","Type":"ContainerStarted","Data":"26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e"} Apr 
21 15:43:25.723526 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:25.723164 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:43:25.724789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:25.724738 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 21 15:43:26.727023 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:26.726987 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:43:26.727510 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:26.727142 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 21 15:43:26.728183 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:26.728160 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:43:27.730143 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:27.730105 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection 
refused" Apr 21 15:43:27.730577 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:27.730482 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:43:37.730293 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:37.730240 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 21 15:43:37.731027 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:37.730775 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:43:47.730818 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:47.730764 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 21 15:43:47.731289 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:47.731211 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:43:57.730649 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:57.730547 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 21 15:43:57.731094 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:43:57.731024 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:44:07.730250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:07.730201 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 21 15:44:07.730772 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:07.730728 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:44:17.730899 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:17.730852 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 21 15:44:17.731406 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:17.731380 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" 
podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:44:27.730505 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:27.730449 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 21 15:44:27.731044 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:27.731020 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:44:30.798693 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:30.798660 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:44:30.799116 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:30.798801 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:44:30.845417 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:30.845356 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podStartSLOduration=66.049203006 podStartE2EDuration="1m29.845337711s" podCreationTimestamp="2026-04-21 15:43:01 +0000 UTC" firstStartedPulling="2026-04-21 15:43:01.480051431 +0000 UTC m=+470.198128757" lastFinishedPulling="2026-04-21 15:43:25.276186135 +0000 UTC m=+493.994263462" observedRunningTime="2026-04-21 15:43:25.752139145 +0000 UTC m=+494.470216494" watchObservedRunningTime="2026-04-21 
15:44:30.845337711 +0000 UTC m=+559.563415059" Apr 21 15:44:46.309028 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.308988 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj"] Apr 21 15:44:46.309528 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.309288 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" containerID="cri-o://2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a" gracePeriod=30 Apr 21 15:44:46.309528 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.309316 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" containerID="cri-o://26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e" gracePeriod=30 Apr 21 15:44:46.652424 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.652324 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb"] Apr 21 15:44:46.655695 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.655671 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" Apr 21 15:44:46.668171 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.668142 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb"] Apr 21 15:44:46.702095 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.702057 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e99a0f4-3071-46ed-a767-89c498271f7e-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb\" (UID: \"1e99a0f4-3071-46ed-a767-89c498271f7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" Apr 21 15:44:46.802592 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.802556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e99a0f4-3071-46ed-a767-89c498271f7e-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb\" (UID: \"1e99a0f4-3071-46ed-a767-89c498271f7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" Apr 21 15:44:46.802939 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.802919 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e99a0f4-3071-46ed-a767-89c498271f7e-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb\" (UID: \"1e99a0f4-3071-46ed-a767-89c498271f7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" Apr 21 15:44:46.947075 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.946987 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh"] Apr 21 15:44:46.950183 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.950166 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" Apr 21 15:44:46.966282 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.966259 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" Apr 21 15:44:46.976501 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:46.976450 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh"] Apr 21 15:44:47.004228 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:47.004198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/afca5dff-afe5-41f5-94b9-2ef013483902-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh\" (UID: \"afca5dff-afe5-41f5-94b9-2ef013483902\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" Apr 21 15:44:47.105247 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:47.105210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/afca5dff-afe5-41f5-94b9-2ef013483902-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh\" (UID: \"afca5dff-afe5-41f5-94b9-2ef013483902\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" Apr 21 15:44:47.105603 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:47.105585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/afca5dff-afe5-41f5-94b9-2ef013483902-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh\" (UID: \"afca5dff-afe5-41f5-94b9-2ef013483902\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" Apr 21 15:44:47.127416 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:47.127384 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb"] Apr 21 15:44:47.130519 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:44:47.130455 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e99a0f4_3071_46ed_a767_89c498271f7e.slice/crio-330bbc060fd9fc11c9d1f83d2e3716aeaf08558e4fa1e2273ca369de2ae4e827 WatchSource:0}: Error finding container 330bbc060fd9fc11c9d1f83d2e3716aeaf08558e4fa1e2273ca369de2ae4e827: Status 404 returned error can't find the container with id 330bbc060fd9fc11c9d1f83d2e3716aeaf08558e4fa1e2273ca369de2ae4e827 Apr 21 15:44:47.260995 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:47.260954 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" Apr 21 15:44:47.435724 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:47.435697 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh"] Apr 21 15:44:47.437646 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:44:47.437617 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafca5dff_afe5_41f5_94b9_2ef013483902.slice/crio-ae27e1d0153db17f38d39f45e1856efe40e3d6b7419fdfe1044b8bab29f009a6 WatchSource:0}: Error finding container ae27e1d0153db17f38d39f45e1856efe40e3d6b7419fdfe1044b8bab29f009a6: Status 404 returned error can't find the container with id ae27e1d0153db17f38d39f45e1856efe40e3d6b7419fdfe1044b8bab29f009a6 Apr 21 15:44:48.025872 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:48.025827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" event={"ID":"1e99a0f4-3071-46ed-a767-89c498271f7e","Type":"ContainerStarted","Data":"1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe"} Apr 21 15:44:48.026064 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:48.025880 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" event={"ID":"1e99a0f4-3071-46ed-a767-89c498271f7e","Type":"ContainerStarted","Data":"330bbc060fd9fc11c9d1f83d2e3716aeaf08558e4fa1e2273ca369de2ae4e827"} Apr 21 15:44:48.027390 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:48.027360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" event={"ID":"afca5dff-afe5-41f5-94b9-2ef013483902","Type":"ContainerStarted","Data":"541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2"} Apr 21 
15:44:48.027527 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:48.027399 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" event={"ID":"afca5dff-afe5-41f5-94b9-2ef013483902","Type":"ContainerStarted","Data":"ae27e1d0153db17f38d39f45e1856efe40e3d6b7419fdfe1044b8bab29f009a6"} Apr 21 15:44:50.798157 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:50.798117 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 21 15:44:50.798570 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:50.798451 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:44:51.040547 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:51.040480 2576 generic.go:358] "Generic (PLEG): container finished" podID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerID="2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a" exitCode=0 Apr 21 15:44:51.040547 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:51.040522 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" event={"ID":"88819024-ef88-4177-9ec6-3d7b2fe18065","Type":"ContainerDied","Data":"2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a"} Apr 21 15:44:52.044631 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:52.044593 2576 generic.go:358] "Generic (PLEG): container finished" podID="1e99a0f4-3071-46ed-a767-89c498271f7e" 
containerID="1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe" exitCode=0 Apr 21 15:44:52.045080 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:52.044662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" event={"ID":"1e99a0f4-3071-46ed-a767-89c498271f7e","Type":"ContainerDied","Data":"1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe"} Apr 21 15:44:52.046213 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:52.046186 2576 generic.go:358] "Generic (PLEG): container finished" podID="afca5dff-afe5-41f5-94b9-2ef013483902" containerID="541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2" exitCode=0 Apr 21 15:44:52.046298 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:52.046245 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" event={"ID":"afca5dff-afe5-41f5-94b9-2ef013483902","Type":"ContainerDied","Data":"541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2"} Apr 21 15:44:53.052669 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:53.052623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" event={"ID":"1e99a0f4-3071-46ed-a767-89c498271f7e","Type":"ContainerStarted","Data":"801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c"} Apr 21 15:44:53.053149 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:53.053003 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" Apr 21 15:44:53.054698 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:53.054610 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 21 15:44:53.092649 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:53.092586 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" podStartSLOduration=7.092563669 podStartE2EDuration="7.092563669s" podCreationTimestamp="2026-04-21 15:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:44:53.09061547 +0000 UTC m=+581.808692819" watchObservedRunningTime="2026-04-21 15:44:53.092563669 +0000 UTC m=+581.810641014" Apr 21 15:44:54.057665 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:44:54.057616 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 21 15:45:00.798779 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:00.798722 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 21 15:45:00.799322 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:00.799083 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:45:04.057838 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:04.057779 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 21 15:45:10.798404 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:10.798365 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 21 15:45:10.798895 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:10.798502 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:45:10.798895 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:10.798744 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:45:10.798895 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:10.798878 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:45:11.128601 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:11.128563 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" event={"ID":"afca5dff-afe5-41f5-94b9-2ef013483902","Type":"ContainerStarted","Data":"0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919"} Apr 21 15:45:11.128873 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:11.128852 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" Apr 21 15:45:11.130401 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:11.130367 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 21 15:45:11.180182 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:11.180127 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" podStartSLOduration=6.971345319 podStartE2EDuration="25.180108065s" podCreationTimestamp="2026-04-21 15:44:46 +0000 UTC" firstStartedPulling="2026-04-21 15:44:52.047369053 +0000 UTC m=+580.765446379" lastFinishedPulling="2026-04-21 15:45:10.256131796 +0000 UTC m=+598.974209125" observedRunningTime="2026-04-21 15:45:11.178617254 +0000 UTC m=+599.896694638" watchObservedRunningTime="2026-04-21 15:45:11.180108065 +0000 UTC m=+599.898185412" Apr 21 15:45:11.755588 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:11.755556 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log" Apr 21 15:45:11.757020 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:11.756996 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log" Apr 21 15:45:12.132681 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:12.132645 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 21 15:45:14.057594 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:14.057539 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 21 15:45:16.464406 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:16.464378 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:45:16.579513 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:16.579399 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88819024-ef88-4177-9ec6-3d7b2fe18065-kserve-provision-location\") pod \"88819024-ef88-4177-9ec6-3d7b2fe18065\" (UID: \"88819024-ef88-4177-9ec6-3d7b2fe18065\") " Apr 21 15:45:16.579776 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:16.579746 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88819024-ef88-4177-9ec6-3d7b2fe18065-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "88819024-ef88-4177-9ec6-3d7b2fe18065" (UID: "88819024-ef88-4177-9ec6-3d7b2fe18065"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:45:16.680990 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:16.680932 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88819024-ef88-4177-9ec6-3d7b2fe18065-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:45:17.155470 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.155428 2576 generic.go:358] "Generic (PLEG): container finished" podID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerID="26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e" exitCode=0 Apr 21 15:45:17.155699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.155527 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" Apr 21 15:45:17.155699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.155525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" event={"ID":"88819024-ef88-4177-9ec6-3d7b2fe18065","Type":"ContainerDied","Data":"26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e"} Apr 21 15:45:17.155699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.155578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj" event={"ID":"88819024-ef88-4177-9ec6-3d7b2fe18065","Type":"ContainerDied","Data":"526659364c26f3a8c6cb6b865e1808550959a42b109ef19f2dfa86597f1532d7"} Apr 21 15:45:17.155699 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.155600 2576 scope.go:117] "RemoveContainer" containerID="26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e" Apr 21 15:45:17.164503 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.164475 2576 scope.go:117] "RemoveContainer" 
containerID="2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a" Apr 21 15:45:17.172339 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.172321 2576 scope.go:117] "RemoveContainer" containerID="75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386" Apr 21 15:45:17.180143 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.180121 2576 scope.go:117] "RemoveContainer" containerID="26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e" Apr 21 15:45:17.180430 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:45:17.180410 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e\": container with ID starting with 26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e not found: ID does not exist" containerID="26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e" Apr 21 15:45:17.180532 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.180457 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e"} err="failed to get container status \"26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e\": rpc error: code = NotFound desc = could not find container \"26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e\": container with ID starting with 26ae01a850576e2c1c0cf7a4144483f9fb66643464d15e96ff9ab59acfc9227e not found: ID does not exist" Apr 21 15:45:17.180532 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.180478 2576 scope.go:117] "RemoveContainer" containerID="2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a" Apr 21 15:45:17.180770 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:45:17.180747 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a\": container with ID starting with 2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a not found: ID does not exist" containerID="2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a" Apr 21 15:45:17.180887 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.180782 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a"} err="failed to get container status \"2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a\": rpc error: code = NotFound desc = could not find container \"2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a\": container with ID starting with 2416e12c99a88a8483d8815ea967817f4e13cf2fb672af7ea18e531db247068a not found: ID does not exist" Apr 21 15:45:17.180887 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.180806 2576 scope.go:117] "RemoveContainer" containerID="75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386" Apr 21 15:45:17.181043 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:45:17.181025 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386\": container with ID starting with 75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386 not found: ID does not exist" containerID="75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386" Apr 21 15:45:17.181080 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.181049 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386"} err="failed to get container status \"75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386\": rpc error: code = NotFound desc = could not find container 
\"75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386\": container with ID starting with 75ec4a69106389e64b876879ad841a49178556364278868fcd2858aecb193386 not found: ID does not exist" Apr 21 15:45:17.194419 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.194387 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj"] Apr 21 15:45:17.207110 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.207080 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-03d5d-predictor-6c5d487c5c-67tvj"] Apr 21 15:45:17.801811 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:17.801778 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" path="/var/lib/kubelet/pods/88819024-ef88-4177-9ec6-3d7b2fe18065/volumes" Apr 21 15:45:22.132787 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:22.132741 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 21 15:45:24.057714 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:24.057664 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 21 15:45:32.132795 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:32.132752 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 21 15:45:34.058042 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:34.057996 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 21 15:45:42.132872 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:42.132818 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 21 15:45:44.058042 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:44.057989 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 21 15:45:52.133319 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:52.133270 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 21 15:45:54.058904 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:45:54.058853 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 21 
15:46:02.133156 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:02.133103 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 21 15:46:04.057999 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:04.057963 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" Apr 21 15:46:12.133722 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:12.133680 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" Apr 21 15:46:26.709642 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.709608 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb"] Apr 21 15:46:26.710149 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.709981 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="kserve-container" containerID="cri-o://801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c" gracePeriod=30 Apr 21 15:46:26.762012 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.761979 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4"] Apr 21 15:46:26.762462 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.762444 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="storage-initializer" Apr 21 15:46:26.762589 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:46:26.762464 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="storage-initializer" Apr 21 15:46:26.762589 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.762505 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" Apr 21 15:46:26.762589 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.762515 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" Apr 21 15:46:26.762589 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.762533 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" Apr 21 15:46:26.762589 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.762542 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" Apr 21 15:46:26.762831 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.762644 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="kserve-container" Apr 21 15:46:26.762831 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.762660 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="88819024-ef88-4177-9ec6-3d7b2fe18065" containerName="agent" Apr 21 15:46:26.765769 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.765747 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" Apr 21 15:46:26.778640 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.778613 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4"] Apr 21 15:46:26.831993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.831958 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc"] Apr 21 15:46:26.835254 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.835234 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" Apr 21 15:46:26.845178 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.845137 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc"] Apr 21 15:46:26.868039 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.867994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf28ca68-7e3c-46d1-8877-a58305cadd3b-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4\" (UID: \"bf28ca68-7e3c-46d1-8877-a58305cadd3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" Apr 21 15:46:26.940505 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.940462 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh"] Apr 21 15:46:26.940825 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.940784 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" 
podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="kserve-container" containerID="cri-o://0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919" gracePeriod=30 Apr 21 15:46:26.969298 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.969211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c39f490-81c3-4bac-9581-b765f2047827-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc\" (UID: \"0c39f490-81c3-4bac-9581-b765f2047827\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" Apr 21 15:46:26.969435 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.969340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf28ca68-7e3c-46d1-8877-a58305cadd3b-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4\" (UID: \"bf28ca68-7e3c-46d1-8877-a58305cadd3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" Apr 21 15:46:26.969780 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:26.969757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf28ca68-7e3c-46d1-8877-a58305cadd3b-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4\" (UID: \"bf28ca68-7e3c-46d1-8877-a58305cadd3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" Apr 21 15:46:27.070105 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:27.070066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c39f490-81c3-4bac-9581-b765f2047827-kserve-provision-location\") pod 
\"isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc\" (UID: \"0c39f490-81c3-4bac-9581-b765f2047827\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" Apr 21 15:46:27.070462 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:27.070442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c39f490-81c3-4bac-9581-b765f2047827-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc\" (UID: \"0c39f490-81c3-4bac-9581-b765f2047827\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" Apr 21 15:46:27.078987 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:27.078955 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" Apr 21 15:46:27.146829 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:27.146801 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" Apr 21 15:46:27.223363 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:27.223273 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4"] Apr 21 15:46:27.226268 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:46:27.226234 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf28ca68_7e3c_46d1_8877_a58305cadd3b.slice/crio-237da7b50d86110e67d14c99c8db60975517853da974cf8f4e0f25fe53e0a7b8 WatchSource:0}: Error finding container 237da7b50d86110e67d14c99c8db60975517853da974cf8f4e0f25fe53e0a7b8: Status 404 returned error can't find the container with id 237da7b50d86110e67d14c99c8db60975517853da974cf8f4e0f25fe53e0a7b8 Apr 21 15:46:27.228848 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:27.228814 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:46:27.303280 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:27.303253 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc"] Apr 21 15:46:27.306839 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:46:27.306811 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c39f490_81c3_4bac_9581_b765f2047827.slice/crio-86ed63fc8fc1533d01b8752e9148cb7b4dd39f50cff151c5b8941a62958a5555 WatchSource:0}: Error finding container 86ed63fc8fc1533d01b8752e9148cb7b4dd39f50cff151c5b8941a62958a5555: Status 404 returned error can't find the container with id 86ed63fc8fc1533d01b8752e9148cb7b4dd39f50cff151c5b8941a62958a5555 Apr 21 15:46:27.404349 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:27.404297 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" event={"ID":"0c39f490-81c3-4bac-9581-b765f2047827","Type":"ContainerStarted","Data":"c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d"} Apr 21 15:46:27.404566 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:27.404361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" event={"ID":"0c39f490-81c3-4bac-9581-b765f2047827","Type":"ContainerStarted","Data":"86ed63fc8fc1533d01b8752e9148cb7b4dd39f50cff151c5b8941a62958a5555"} Apr 21 15:46:27.405821 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:27.405786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" event={"ID":"bf28ca68-7e3c-46d1-8877-a58305cadd3b","Type":"ContainerStarted","Data":"742eb41f06104ea3ceab4c5011abc4a7312e37de98b384bfb234aeee72337391"} Apr 21 15:46:27.405821 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:27.405821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" event={"ID":"bf28ca68-7e3c-46d1-8877-a58305cadd3b","Type":"ContainerStarted","Data":"237da7b50d86110e67d14c99c8db60975517853da974cf8f4e0f25fe53e0a7b8"} Apr 21 15:46:30.992680 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:30.992649 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" Apr 21 15:46:31.108751 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.108719 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/afca5dff-afe5-41f5-94b9-2ef013483902-kserve-provision-location\") pod \"afca5dff-afe5-41f5-94b9-2ef013483902\" (UID: \"afca5dff-afe5-41f5-94b9-2ef013483902\") " Apr 21 15:46:31.109036 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.109013 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afca5dff-afe5-41f5-94b9-2ef013483902-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "afca5dff-afe5-41f5-94b9-2ef013483902" (UID: "afca5dff-afe5-41f5-94b9-2ef013483902"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:46:31.209929 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.209890 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/afca5dff-afe5-41f5-94b9-2ef013483902-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:46:31.424012 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.423975 2576 generic.go:358] "Generic (PLEG): container finished" podID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerID="742eb41f06104ea3ceab4c5011abc4a7312e37de98b384bfb234aeee72337391" exitCode=0 Apr 21 15:46:31.424183 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.424040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" event={"ID":"bf28ca68-7e3c-46d1-8877-a58305cadd3b","Type":"ContainerDied","Data":"742eb41f06104ea3ceab4c5011abc4a7312e37de98b384bfb234aeee72337391"} Apr 21 15:46:31.425588 
ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.425561 2576 generic.go:358] "Generic (PLEG): container finished" podID="afca5dff-afe5-41f5-94b9-2ef013483902" containerID="0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919" exitCode=0 Apr 21 15:46:31.425709 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.425596 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" event={"ID":"afca5dff-afe5-41f5-94b9-2ef013483902","Type":"ContainerDied","Data":"0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919"} Apr 21 15:46:31.425709 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.425630 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" Apr 21 15:46:31.425709 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.425638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh" event={"ID":"afca5dff-afe5-41f5-94b9-2ef013483902","Type":"ContainerDied","Data":"ae27e1d0153db17f38d39f45e1856efe40e3d6b7419fdfe1044b8bab29f009a6"} Apr 21 15:46:31.425709 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.425659 2576 scope.go:117] "RemoveContainer" containerID="0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919" Apr 21 15:46:31.427100 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.427078 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c39f490-81c3-4bac-9581-b765f2047827" containerID="c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d" exitCode=0 Apr 21 15:46:31.427217 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.427144 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" 
event={"ID":"0c39f490-81c3-4bac-9581-b765f2047827","Type":"ContainerDied","Data":"c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d"} Apr 21 15:46:31.439371 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.439346 2576 scope.go:117] "RemoveContainer" containerID="541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2" Apr 21 15:46:31.451392 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.451364 2576 scope.go:117] "RemoveContainer" containerID="0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919" Apr 21 15:46:31.451773 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:46:31.451740 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919\": container with ID starting with 0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919 not found: ID does not exist" containerID="0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919" Apr 21 15:46:31.451935 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.451786 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919"} err="failed to get container status \"0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919\": rpc error: code = NotFound desc = could not find container \"0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919\": container with ID starting with 0ffafb56b1431ed73ce9edd02ea3a8f1a7d617e0591f1c95ba85611b012ef919 not found: ID does not exist" Apr 21 15:46:31.451935 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.451809 2576 scope.go:117] "RemoveContainer" containerID="541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2" Apr 21 15:46:31.452062 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:46:31.452035 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2\": container with ID starting with 541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2 not found: ID does not exist" containerID="541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2" Apr 21 15:46:31.452126 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.452063 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2"} err="failed to get container status \"541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2\": rpc error: code = NotFound desc = could not find container \"541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2\": container with ID starting with 541ed74c08f166635cdb9186ea0b502ae643acd8db51f6096283405c310038d2 not found: ID does not exist" Apr 21 15:46:31.464389 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.464331 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh"] Apr 21 15:46:31.467632 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.467608 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-4bd9d-predictor-cb5d76c5-xrhlh"] Apr 21 15:46:31.556249 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.556227 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" Apr 21 15:46:31.714756 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.714662 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e99a0f4-3071-46ed-a767-89c498271f7e-kserve-provision-location\") pod \"1e99a0f4-3071-46ed-a767-89c498271f7e\" (UID: \"1e99a0f4-3071-46ed-a767-89c498271f7e\") " Apr 21 15:46:31.714993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.714971 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e99a0f4-3071-46ed-a767-89c498271f7e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1e99a0f4-3071-46ed-a767-89c498271f7e" (UID: "1e99a0f4-3071-46ed-a767-89c498271f7e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:46:31.804263 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.804235 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" path="/var/lib/kubelet/pods/afca5dff-afe5-41f5-94b9-2ef013483902/volumes" Apr 21 15:46:31.815362 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:31.815337 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e99a0f4-3071-46ed-a767-89c498271f7e-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:46:32.432352 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.432312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" event={"ID":"bf28ca68-7e3c-46d1-8877-a58305cadd3b","Type":"ContainerStarted","Data":"3bca672dc54011393f04a9864520500987a873b5b3c5a20a327703eea3f65858"} Apr 21 15:46:32.432876 
ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.432657 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" Apr 21 15:46:32.434543 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.434484 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 15:46:32.434767 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.434746 2576 generic.go:358] "Generic (PLEG): container finished" podID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerID="801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c" exitCode=0 Apr 21 15:46:32.434860 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.434809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" event={"ID":"1e99a0f4-3071-46ed-a767-89c498271f7e","Type":"ContainerDied","Data":"801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c"} Apr 21 15:46:32.434860 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.434818 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" Apr 21 15:46:32.434860 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.434849 2576 scope.go:117] "RemoveContainer" containerID="801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c" Apr 21 15:46:32.434985 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.434835 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb" event={"ID":"1e99a0f4-3071-46ed-a767-89c498271f7e","Type":"ContainerDied","Data":"330bbc060fd9fc11c9d1f83d2e3716aeaf08558e4fa1e2273ca369de2ae4e827"} Apr 21 15:46:32.436670 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.436643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" event={"ID":"0c39f490-81c3-4bac-9581-b765f2047827","Type":"ContainerStarted","Data":"fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b"} Apr 21 15:46:32.436964 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.436941 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" Apr 21 15:46:32.438107 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.438083 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 15:46:32.443442 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.443427 2576 scope.go:117] "RemoveContainer" containerID="1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe" Apr 21 15:46:32.451518 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.451473 2576 scope.go:117] "RemoveContainer" 
containerID="801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c" Apr 21 15:46:32.451821 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:46:32.451801 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c\": container with ID starting with 801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c not found: ID does not exist" containerID="801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c" Apr 21 15:46:32.451871 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.451834 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c"} err="failed to get container status \"801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c\": rpc error: code = NotFound desc = could not find container \"801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c\": container with ID starting with 801d63b7abaf72e39a8f4a403e0b8b9830f628d29598ee644b07c72b2928f38c not found: ID does not exist" Apr 21 15:46:32.451871 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.451855 2576 scope.go:117] "RemoveContainer" containerID="1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe" Apr 21 15:46:32.452112 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:46:32.452091 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe\": container with ID starting with 1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe not found: ID does not exist" containerID="1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe" Apr 21 15:46:32.452179 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.452128 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe"} err="failed to get container status \"1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe\": rpc error: code = NotFound desc = could not find container \"1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe\": container with ID starting with 1472fd85a2b1a9094c6365e4ad9ebe2ad570249b750405ff7038fa8b121857fe not found: ID does not exist" Apr 21 15:46:32.468750 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.468694 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" podStartSLOduration=6.468677674 podStartE2EDuration="6.468677674s" podCreationTimestamp="2026-04-21 15:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:46:32.465366338 +0000 UTC m=+681.183443687" watchObservedRunningTime="2026-04-21 15:46:32.468677674 +0000 UTC m=+681.186755022" Apr 21 15:46:32.489995 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.489960 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb"] Apr 21 15:46:32.493551 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.493481 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-4bd9d-predictor-85f57b9f67-q49pb"] Apr 21 15:46:32.519991 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:32.519939 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" podStartSLOduration=6.519921995 podStartE2EDuration="6.519921995s" podCreationTimestamp="2026-04-21 15:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-21 15:46:32.519264202 +0000 UTC m=+681.237341553" watchObservedRunningTime="2026-04-21 15:46:32.519921995 +0000 UTC m=+681.237999343" Apr 21 15:46:33.441355 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:33.441303 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 15:46:33.441820 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:33.441417 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 15:46:33.803223 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:33.803181 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" path="/var/lib/kubelet/pods/1e99a0f4-3071-46ed-a767-89c498271f7e/volumes" Apr 21 15:46:43.441542 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:43.441464 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 15:46:43.441542 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:43.441464 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 
15:46:53.441838 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:53.441744 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 21 15:46:53.442235 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:46:53.441748 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 21 15:47:03.441578 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:47:03.441524 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 21 15:47:03.442028 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:47:03.441524 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 21 15:47:13.442173 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:47:13.442119 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 21 15:47:13.442628 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:47:13.442119 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 21 15:47:23.442322 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:47:23.442266 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 21 15:47:23.442850 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:47:23.442266 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 21 15:47:33.442165 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:47:33.442121 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 21 15:47:33.443266 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:47:33.443245 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc"
Apr 21 15:47:36.797759 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:47:36.797713 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 21 15:47:46.798819 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:47:46.798782 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4"
Apr 21 15:48:17.058714 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.058679 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4"]
Apr 21 15:48:17.059211 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.058977 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container" containerID="cri-o://3bca672dc54011393f04a9864520500987a873b5b3c5a20a327703eea3f65858" gracePeriod=30
Apr 21 15:48:17.098293 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.098256 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5"]
Apr 21 15:48:17.098673 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.098659 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="kserve-container"
Apr 21 15:48:17.098732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.098674 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="kserve-container"
Apr 21 15:48:17.098732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.098699 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="storage-initializer"
Apr 21 15:48:17.098732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.098706 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="storage-initializer"
Apr 21 15:48:17.098732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.098712 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="kserve-container"
Apr 21 15:48:17.098732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.098718 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="kserve-container"
Apr 21 15:48:17.098732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.098727 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="storage-initializer"
Apr 21 15:48:17.098732 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.098732 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="storage-initializer"
Apr 21 15:48:17.098966 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.098798 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e99a0f4-3071-46ed-a767-89c498271f7e" containerName="kserve-container"
Apr 21 15:48:17.098966 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.098808 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="afca5dff-afe5-41f5-94b9-2ef013483902" containerName="kserve-container"
Apr 21 15:48:17.101855 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.101833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5"
Apr 21 15:48:17.109198 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.109155 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5"]
Apr 21 15:48:17.113373 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.113351 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5"
Apr 21 15:48:17.160816 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.160783 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc"]
Apr 21 15:48:17.161166 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.161119 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="kserve-container" containerID="cri-o://fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b" gracePeriod=30
Apr 21 15:48:17.249467 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.249431 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5"]
Apr 21 15:48:17.252080 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:48:17.252052 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cdb1f93_9c12_47ac_9a6f_61cb0bb8ae31.slice/crio-07476ee443f08d3a2e73956be5ce596d0513402f4ac6881a87f1735b63ee66f8 WatchSource:0}: Error finding container 07476ee443f08d3a2e73956be5ce596d0513402f4ac6881a87f1735b63ee66f8: Status 404 returned error can't find the container with id 07476ee443f08d3a2e73956be5ce596d0513402f4ac6881a87f1735b63ee66f8
Apr 21 15:48:17.804503 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:17.804451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5" event={"ID":"6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31","Type":"ContainerStarted","Data":"07476ee443f08d3a2e73956be5ce596d0513402f4ac6881a87f1735b63ee66f8"}
Apr 21 15:48:18.808868 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:18.808837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5" event={"ID":"6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31","Type":"ContainerStarted","Data":"76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa"}
Apr 21 15:48:18.809333 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:18.808994 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5"
Apr 21 15:48:18.810627 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:18.810606 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5"
Apr 21 15:48:18.824328 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:18.824273 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5" podStartSLOduration=0.707962865 podStartE2EDuration="1.824257304s" podCreationTimestamp="2026-04-21 15:48:17 +0000 UTC" firstStartedPulling="2026-04-21 15:48:17.254047031 +0000 UTC m=+785.972124357" lastFinishedPulling="2026-04-21 15:48:18.370341452 +0000 UTC m=+787.088418796" observedRunningTime="2026-04-21 15:48:18.823803154 +0000 UTC m=+787.541880526" watchObservedRunningTime="2026-04-21 15:48:18.824257304 +0000 UTC m=+787.542334655"
Apr 21 15:48:21.310198 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.310169 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc"
Apr 21 15:48:21.462314 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.462224 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c39f490-81c3-4bac-9581-b765f2047827-kserve-provision-location\") pod \"0c39f490-81c3-4bac-9581-b765f2047827\" (UID: \"0c39f490-81c3-4bac-9581-b765f2047827\") "
Apr 21 15:48:21.462650 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.462626 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c39f490-81c3-4bac-9581-b765f2047827-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0c39f490-81c3-4bac-9581-b765f2047827" (UID: "0c39f490-81c3-4bac-9581-b765f2047827"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:48:21.563074 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.563036 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c39f490-81c3-4bac-9581-b765f2047827-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:48:21.820993 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.820961 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c39f490-81c3-4bac-9581-b765f2047827" containerID="fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b" exitCode=0
Apr 21 15:48:21.821154 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.821036 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc"
Apr 21 15:48:21.821154 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.821035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" event={"ID":"0c39f490-81c3-4bac-9581-b765f2047827","Type":"ContainerDied","Data":"fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b"}
Apr 21 15:48:21.821154 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.821079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc" event={"ID":"0c39f490-81c3-4bac-9581-b765f2047827","Type":"ContainerDied","Data":"86ed63fc8fc1533d01b8752e9148cb7b4dd39f50cff151c5b8941a62958a5555"}
Apr 21 15:48:21.821154 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.821100 2576 scope.go:117] "RemoveContainer" containerID="fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b"
Apr 21 15:48:21.823607 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.823579 2576 generic.go:358] "Generic (PLEG): container finished" podID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerID="3bca672dc54011393f04a9864520500987a873b5b3c5a20a327703eea3f65858" exitCode=0
Apr 21 15:48:21.823722 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.823696 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" event={"ID":"bf28ca68-7e3c-46d1-8877-a58305cadd3b","Type":"ContainerDied","Data":"3bca672dc54011393f04a9864520500987a873b5b3c5a20a327703eea3f65858"}
Apr 21 15:48:21.830319 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.830257 2576 scope.go:117] "RemoveContainer" containerID="c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d"
Apr 21 15:48:21.840314 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.840254 2576 scope.go:117] "RemoveContainer" containerID="fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b"
Apr 21 15:48:21.841446 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:48:21.841397 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b\": container with ID starting with fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b not found: ID does not exist" containerID="fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b"
Apr 21 15:48:21.841689 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.841453 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b"} err="failed to get container status \"fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b\": rpc error: code = NotFound desc = could not find container \"fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b\": container with ID starting with fe3f657437ffe67b146d17432a5df8cdbf466cf09823bbe47df9079042c3f14b not found: ID does not exist"
Apr 21 15:48:21.841689 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.841483 2576 scope.go:117] "RemoveContainer" containerID="c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d"
Apr 21 15:48:21.842912 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:48:21.842883 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d\": container with ID starting with c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d not found: ID does not exist" containerID="c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d"
Apr 21 15:48:21.843000 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.842921 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d"} err="failed to get container status \"c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d\": rpc error: code = NotFound desc = could not find container \"c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d\": container with ID starting with c5e6c13c9ca55dd5080e82e701388a0f29b5ded41dd572483a03b22f090b569d not found: ID does not exist"
Apr 21 15:48:21.845048 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.844873 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc"]
Apr 21 15:48:21.847626 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.847598 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3991c-predictor-6d58c8fd99-2n4qc"]
Apr 21 15:48:21.906973 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:21.906950 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4"
Apr 21 15:48:22.068190 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:22.068156 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf28ca68-7e3c-46d1-8877-a58305cadd3b-kserve-provision-location\") pod \"bf28ca68-7e3c-46d1-8877-a58305cadd3b\" (UID: \"bf28ca68-7e3c-46d1-8877-a58305cadd3b\") "
Apr 21 15:48:22.068535 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:22.068484 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf28ca68-7e3c-46d1-8877-a58305cadd3b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bf28ca68-7e3c-46d1-8877-a58305cadd3b" (UID: "bf28ca68-7e3c-46d1-8877-a58305cadd3b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:48:22.169532 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:22.169467 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf28ca68-7e3c-46d1-8877-a58305cadd3b-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:48:22.830004 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:22.829965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4" event={"ID":"bf28ca68-7e3c-46d1-8877-a58305cadd3b","Type":"ContainerDied","Data":"237da7b50d86110e67d14c99c8db60975517853da974cf8f4e0f25fe53e0a7b8"}
Apr 21 15:48:22.830004 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:22.829990 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4"
Apr 21 15:48:22.830563 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:22.830021 2576 scope.go:117] "RemoveContainer" containerID="3bca672dc54011393f04a9864520500987a873b5b3c5a20a327703eea3f65858"
Apr 21 15:48:22.839266 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:22.839248 2576 scope.go:117] "RemoveContainer" containerID="742eb41f06104ea3ceab4c5011abc4a7312e37de98b384bfb234aeee72337391"
Apr 21 15:48:22.854054 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:22.854023 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4"]
Apr 21 15:48:22.859250 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:22.859223 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3991c-predictor-65d954f97f-xkfv4"]
Apr 21 15:48:23.802656 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:23.802621 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c39f490-81c3-4bac-9581-b765f2047827" path="/var/lib/kubelet/pods/0c39f490-81c3-4bac-9581-b765f2047827/volumes"
Apr 21 15:48:23.803018 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:23.803005 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" path="/var/lib/kubelet/pods/bf28ca68-7e3c-46d1-8877-a58305cadd3b/volumes"
Apr 21 15:48:27.199944 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.199899 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"]
Apr 21 15:48:27.200344 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.200334 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="storage-initializer"
Apr 21 15:48:27.200396 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.200348 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="storage-initializer"
Apr 21 15:48:27.200396 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.200361 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container"
Apr 21 15:48:27.200396 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.200368 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container"
Apr 21 15:48:27.200396 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.200389 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="kserve-container"
Apr 21 15:48:27.200396 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.200394 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="kserve-container"
Apr 21 15:48:27.200599 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.200407 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="storage-initializer"
Apr 21 15:48:27.200599 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.200413 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="storage-initializer"
Apr 21 15:48:27.200599 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.200468 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf28ca68-7e3c-46d1-8877-a58305cadd3b" containerName="kserve-container"
Apr 21 15:48:27.200599 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.200476 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c39f490-81c3-4bac-9581-b765f2047827" containerName="kserve-container"
Apr 21 15:48:27.203847 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.203827 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"
Apr 21 15:48:27.222702 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.222673 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"]
Apr 21 15:48:27.315670 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.315611 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9-kserve-provision-location\") pod \"isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm\" (UID: \"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9\") " pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"
Apr 21 15:48:27.416434 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.416380 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9-kserve-provision-location\") pod \"isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm\" (UID: \"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9\") " pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"
Apr 21 15:48:27.416851 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.416825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9-kserve-provision-location\") pod \"isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm\" (UID: \"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9\") " pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"
Apr 21 15:48:27.516568 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.516448 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"
Apr 21 15:48:27.652295 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.652208 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"]
Apr 21 15:48:27.655175 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:48:27.655148 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0b5fd60_7400_4218_9e2c_f8f3ea2ef9c9.slice/crio-9b423d537680155f52446ee8c7dfe8156b265cbaaa1c919ab0d63baf745f08d5 WatchSource:0}: Error finding container 9b423d537680155f52446ee8c7dfe8156b265cbaaa1c919ab0d63baf745f08d5: Status 404 returned error can't find the container with id 9b423d537680155f52446ee8c7dfe8156b265cbaaa1c919ab0d63baf745f08d5
Apr 21 15:48:27.855364 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.855326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" event={"ID":"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9","Type":"ContainerStarted","Data":"fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd"}
Apr 21 15:48:27.855364 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:27.855365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" event={"ID":"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9","Type":"ContainerStarted","Data":"9b423d537680155f52446ee8c7dfe8156b265cbaaa1c919ab0d63baf745f08d5"}
Apr 21 15:48:31.871597 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:31.871564 2576 generic.go:358] "Generic (PLEG): container finished" podID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerID="fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd" exitCode=0
Apr 21 15:48:31.871982 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:31.871637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" event={"ID":"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9","Type":"ContainerDied","Data":"fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd"}
Apr 21 15:48:32.876746 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:32.876704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" event={"ID":"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9","Type":"ContainerStarted","Data":"aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f"}
Apr 21 15:48:32.877228 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:32.876757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" event={"ID":"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9","Type":"ContainerStarted","Data":"36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01"}
Apr 21 15:48:32.877228 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:32.877133 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"
Apr 21 15:48:32.877228 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:32.877209 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"
Apr 21 15:48:32.879084 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:32.879042 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 21 15:48:32.879748 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:32.879725 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:48:32.898456 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:32.898404 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podStartSLOduration=5.898389 podStartE2EDuration="5.898389s" podCreationTimestamp="2026-04-21 15:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:48:32.895891355 +0000 UTC m=+801.613968705" watchObservedRunningTime="2026-04-21 15:48:32.898389 +0000 UTC m=+801.616466348"
Apr 21 15:48:33.880595 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:33.880550 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 21 15:48:33.881010 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:33.880905 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:48:43.880826 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:43.880768 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 21 15:48:43.881243 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:43.881190 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:48:53.880868 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:53.880814 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 21 15:48:53.881330 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:48:53.881229 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:49:03.881023 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:03.880965 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 21 15:49:03.881470 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:03.881444 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:49:13.881331 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:13.881267 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 21 15:49:13.881756 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:13.881711 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:49:23.880630 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:23.880577 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 21 15:49:23.881151 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:23.880974 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:49:33.881413 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:33.881370 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 21 15:49:33.881963 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:33.881938 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:49:39.802124 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:39.802094 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"
Apr 21 15:49:39.802542 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:39.802145 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"
Apr 21 15:49:52.205787 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.205749 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-61a06-predictor-cf9788b8b-9nls5_6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31/kserve-container/0.log"
Apr 21 15:49:52.531039 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.530928 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"]
Apr 21 15:49:52.531275 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.531250 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" containerID="cri-o://36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01" gracePeriod=30
Apr 21 15:49:52.531379 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.531348 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" containerID="cri-o://aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f" gracePeriod=30
Apr 21 15:49:52.597573 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.597537 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8"]
Apr 21 15:49:52.601460 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.601442 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8"
Apr 21 15:49:52.616928 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.616902 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8"]
Apr 21 15:49:52.635419 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.635389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbf23fac-035d-482a-bbbd-0e1761f28b94-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8\" (UID: \"bbf23fac-035d-482a-bbbd-0e1761f28b94\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8"
Apr 21 15:49:52.733804 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.733770 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5"]
Apr 21 15:49:52.734071 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.734030 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5" podUID="6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31" containerName="kserve-container" containerID="cri-o://76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa" gracePeriod=30
Apr 21 15:49:52.737005 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.736969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbf23fac-035d-482a-bbbd-0e1761f28b94-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8\" (UID: \"bbf23fac-035d-482a-bbbd-0e1761f28b94\") "
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" Apr 21 15:49:52.737468 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.737424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbf23fac-035d-482a-bbbd-0e1761f28b94-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8\" (UID: \"bbf23fac-035d-482a-bbbd-0e1761f28b94\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" Apr 21 15:49:52.920937 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.920906 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" Apr 21 15:49:52.976308 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:52.976267 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5" Apr 21 15:49:53.066620 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.066529 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8"] Apr 21 15:49:53.069403 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:49:53.069375 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbf23fac_035d_482a_bbbd_0e1761f28b94.slice/crio-479b290fb85f4e479ef8a451ad020101d2ef0d6caac840ccdcc25f45ae592e04 WatchSource:0}: Error finding container 479b290fb85f4e479ef8a451ad020101d2ef0d6caac840ccdcc25f45ae592e04: Status 404 returned error can't find the container with id 479b290fb85f4e479ef8a451ad020101d2ef0d6caac840ccdcc25f45ae592e04 Apr 21 15:49:53.162274 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.162233 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31" containerID="76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa" exitCode=2 Apr 21 15:49:53.162457 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.162298 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5" Apr 21 15:49:53.162457 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.162317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5" event={"ID":"6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31","Type":"ContainerDied","Data":"76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa"} Apr 21 15:49:53.162457 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.162363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5" event={"ID":"6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31","Type":"ContainerDied","Data":"07476ee443f08d3a2e73956be5ce596d0513402f4ac6881a87f1735b63ee66f8"} Apr 21 15:49:53.162457 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.162379 2576 scope.go:117] "RemoveContainer" containerID="76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa" Apr 21 15:49:53.164099 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.164074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" event={"ID":"bbf23fac-035d-482a-bbbd-0e1761f28b94","Type":"ContainerStarted","Data":"846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28"} Apr 21 15:49:53.164203 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.164107 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" 
event={"ID":"bbf23fac-035d-482a-bbbd-0e1761f28b94","Type":"ContainerStarted","Data":"479b290fb85f4e479ef8a451ad020101d2ef0d6caac840ccdcc25f45ae592e04"} Apr 21 15:49:53.172139 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.172120 2576 scope.go:117] "RemoveContainer" containerID="76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa" Apr 21 15:49:53.172398 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:49:53.172378 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa\": container with ID starting with 76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa not found: ID does not exist" containerID="76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa" Apr 21 15:49:53.172461 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.172412 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa"} err="failed to get container status \"76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa\": rpc error: code = NotFound desc = could not find container \"76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa\": container with ID starting with 76de8bfdcb179e6040f0e4048a36c4813426efa375af26bb537d24b7f2fac2fa not found: ID does not exist" Apr 21 15:49:53.213891 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.213861 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5"] Apr 21 15:49:53.220041 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.220013 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-61a06-predictor-cf9788b8b-9nls5"] Apr 21 15:49:53.803278 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:53.803224 2576 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31" path="/var/lib/kubelet/pods/6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31/volumes" Apr 21 15:49:57.179553 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:57.179455 2576 generic.go:358] "Generic (PLEG): container finished" podID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerID="846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28" exitCode=0 Apr 21 15:49:57.179553 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:57.179530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" event={"ID":"bbf23fac-035d-482a-bbbd-0e1761f28b94","Type":"ContainerDied","Data":"846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28"} Apr 21 15:49:58.184321 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:58.184274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" event={"ID":"bbf23fac-035d-482a-bbbd-0e1761f28b94","Type":"ContainerStarted","Data":"ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c"} Apr 21 15:49:58.184819 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:58.184639 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" Apr 21 15:49:58.186069 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:58.186029 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:49:58.186336 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:58.186314 2576 generic.go:358] "Generic (PLEG): container finished" podID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" 
containerID="36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01" exitCode=0 Apr 21 15:49:58.186417 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:58.186360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" event={"ID":"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9","Type":"ContainerDied","Data":"36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01"} Apr 21 15:49:58.225740 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:58.225684 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podStartSLOduration=6.225665783 podStartE2EDuration="6.225665783s" podCreationTimestamp="2026-04-21 15:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:49:58.224817223 +0000 UTC m=+886.942894572" watchObservedRunningTime="2026-04-21 15:49:58.225665783 +0000 UTC m=+886.943743110" Apr 21 15:49:59.189984 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:59.189938 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:49:59.798396 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:59.798338 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 21 15:49:59.798752 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:49:59.798707 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:50:09.190179 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:09.190132 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:50:09.798046 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:09.797992 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 21 15:50:09.798308 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:09.798280 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:50:11.782003 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:11.781970 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log" Apr 21 15:50:11.784758 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:11.784733 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log" Apr 21 15:50:19.190819 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:19.190774 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:50:19.798330 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:19.798276 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 21 15:50:19.798679 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:19.798644 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:50:19.801609 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:19.801589 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" Apr 21 15:50:19.801703 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:19.801639 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" Apr 21 15:50:23.181111 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.181084 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" Apr 21 15:50:23.280193 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.280154 2576 generic.go:358] "Generic (PLEG): container finished" podID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerID="aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f" exitCode=0 Apr 21 15:50:23.280391 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.280200 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" event={"ID":"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9","Type":"ContainerDied","Data":"aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f"} Apr 21 15:50:23.280391 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.280235 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" Apr 21 15:50:23.280391 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.280249 2576 scope.go:117] "RemoveContainer" containerID="aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f" Apr 21 15:50:23.280391 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.280233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm" event={"ID":"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9","Type":"ContainerDied","Data":"9b423d537680155f52446ee8c7dfe8156b265cbaaa1c919ab0d63baf745f08d5"} Apr 21 15:50:23.288638 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.288612 2576 scope.go:117] "RemoveContainer" containerID="36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01" Apr 21 15:50:23.296143 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.296124 2576 scope.go:117] "RemoveContainer" containerID="fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd" Apr 21 15:50:23.303502 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:50:23.303472 2576 scope.go:117] "RemoveContainer" containerID="aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f" Apr 21 15:50:23.303785 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:50:23.303763 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f\": container with ID starting with aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f not found: ID does not exist" containerID="aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f" Apr 21 15:50:23.303843 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.303797 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f"} err="failed to get container status \"aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f\": rpc error: code = NotFound desc = could not find container \"aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f\": container with ID starting with aa92cf8a73b15b149bcf1940aa7d40b5f46e352c39b6987f8ca7d8a9876ee63f not found: ID does not exist" Apr 21 15:50:23.303843 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.303818 2576 scope.go:117] "RemoveContainer" containerID="36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01" Apr 21 15:50:23.304057 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:50:23.304039 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01\": container with ID starting with 36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01 not found: ID does not exist" containerID="36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01" Apr 21 15:50:23.304101 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.304064 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01"} err="failed to get container status \"36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01\": rpc error: code = NotFound desc = could not find container \"36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01\": container with ID starting with 36245439b68a81568f778346459a0c10f0a7283433b7f770fff69d77d4807e01 not found: ID does not exist" Apr 21 15:50:23.304101 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.304081 2576 scope.go:117] "RemoveContainer" containerID="fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd" Apr 21 15:50:23.304326 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:50:23.304308 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd\": container with ID starting with fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd not found: ID does not exist" containerID="fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd" Apr 21 15:50:23.304380 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.304335 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd"} err="failed to get container status \"fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd\": rpc error: code = NotFound desc = could not find container \"fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd\": container with ID starting with fd0bda9ce62c069b3e51815549f3206b043ab1b3fc4c49357eee17eb27020cbd not found: ID does not exist" Apr 21 15:50:23.308596 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.308578 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9-kserve-provision-location\") pod \"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9\" (UID: \"d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9\") " Apr 21 15:50:23.308887 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.308867 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" (UID: "d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:50:23.409561 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.409523 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:50:23.605171 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.605136 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"] Apr 21 15:50:23.609052 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.609024 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-61a06-predictor-58dddf4bf8-8wlzm"] Apr 21 15:50:23.802078 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:23.802035 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" path="/var/lib/kubelet/pods/d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9/volumes" Apr 21 15:50:29.190736 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:29.190693 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" 
podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:50:39.190261 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:39.190216 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:50:49.190004 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:49.189959 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:50:59.190682 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:50:59.190630 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:51:05.797705 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:51:05.797664 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:51:15.797861 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:51:15.797807 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:51:25.797950 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:51:25.797848 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:51:35.797958 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:51:35.797900 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:51:45.798357 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:51:45.798311 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:51:55.797843 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:51:55.797788 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:52:05.798587 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:05.798537 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.44:8080: connect: connection refused" Apr 21 15:52:15.798320 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:15.798275 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:52:25.802400 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:25.802366 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" Apr 21 15:52:32.695740 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.695707 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8"] Apr 21 15:52:32.696214 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.696009 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" containerID="cri-o://ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c" gracePeriod=30 Apr 21 15:52:32.785550 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.785513 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv"] Apr 21 15:52:32.785926 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.785913 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" Apr 21 15:52:32.785975 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.785930 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" Apr 21 15:52:32.785975 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:52:32.785948 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="storage-initializer" Apr 21 15:52:32.785975 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.785954 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="storage-initializer" Apr 21 15:52:32.785975 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.785961 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" Apr 21 15:52:32.785975 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.785966 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" Apr 21 15:52:32.785975 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.785975 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31" containerName="kserve-container" Apr 21 15:52:32.786153 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.785981 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31" containerName="kserve-container" Apr 21 15:52:32.786153 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.786050 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6cdb1f93-9c12-47ac-9a6f-61cb0bb8ae31" containerName="kserve-container" Apr 21 15:52:32.786153 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.786059 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="agent" Apr 21 15:52:32.786153 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.786066 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0b5fd60-7400-4218-9e2c-f8f3ea2ef9c9" containerName="kserve-container" Apr 21 
15:52:32.789380 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.789364 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" Apr 21 15:52:32.798702 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.798658 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv"] Apr 21 15:52:32.812353 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.812318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34654724-da9c-44af-a61c-ed4883f3999d-kserve-provision-location\") pod \"isvc-primary-daad18-predictor-7d747cb478-h4pvv\" (UID: \"34654724-da9c-44af-a61c-ed4883f3999d\") " pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" Apr 21 15:52:32.913742 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.913702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34654724-da9c-44af-a61c-ed4883f3999d-kserve-provision-location\") pod \"isvc-primary-daad18-predictor-7d747cb478-h4pvv\" (UID: \"34654724-da9c-44af-a61c-ed4883f3999d\") " pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" Apr 21 15:52:32.914089 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:32.914066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34654724-da9c-44af-a61c-ed4883f3999d-kserve-provision-location\") pod \"isvc-primary-daad18-predictor-7d747cb478-h4pvv\" (UID: \"34654724-da9c-44af-a61c-ed4883f3999d\") " pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" Apr 21 15:52:33.100972 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:33.100941 2576 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" Apr 21 15:52:33.228395 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:33.228364 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv"] Apr 21 15:52:33.231296 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:52:33.231269 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34654724_da9c_44af_a61c_ed4883f3999d.slice/crio-6d09914bf863b35ff4b2ce72180f9cdf104eb05f4c7c3c7b0de304e369534658 WatchSource:0}: Error finding container 6d09914bf863b35ff4b2ce72180f9cdf104eb05f4c7c3c7b0de304e369534658: Status 404 returned error can't find the container with id 6d09914bf863b35ff4b2ce72180f9cdf104eb05f4c7c3c7b0de304e369534658 Apr 21 15:52:33.233453 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:33.233434 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:52:33.740464 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:33.740421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" event={"ID":"34654724-da9c-44af-a61c-ed4883f3999d","Type":"ContainerStarted","Data":"454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c"} Apr 21 15:52:33.740464 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:33.740462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" event={"ID":"34654724-da9c-44af-a61c-ed4883f3999d","Type":"ContainerStarted","Data":"6d09914bf863b35ff4b2ce72180f9cdf104eb05f4c7c3c7b0de304e369534658"} Apr 21 15:52:35.799032 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:35.798991 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" 
podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 15:52:37.756125 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:37.756086 2576 generic.go:358] "Generic (PLEG): container finished" podID="34654724-da9c-44af-a61c-ed4883f3999d" containerID="454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c" exitCode=0 Apr 21 15:52:37.756556 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:37.756162 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" event={"ID":"34654724-da9c-44af-a61c-ed4883f3999d","Type":"ContainerDied","Data":"454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c"} Apr 21 15:52:38.761383 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:38.761347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" event={"ID":"34654724-da9c-44af-a61c-ed4883f3999d","Type":"ContainerStarted","Data":"d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61"} Apr 21 15:52:38.761875 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:38.761732 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" Apr 21 15:52:38.763061 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:38.763033 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 15:52:38.787064 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:38.786998 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" 
podStartSLOduration=6.786975286 podStartE2EDuration="6.786975286s" podCreationTimestamp="2026-04-21 15:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:52:38.783803913 +0000 UTC m=+1047.501881288" watchObservedRunningTime="2026-04-21 15:52:38.786975286 +0000 UTC m=+1047.505052638" Apr 21 15:52:39.766018 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:39.765981 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 15:52:42.439798 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.439769 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" Apr 21 15:52:42.495091 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.495056 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbf23fac-035d-482a-bbbd-0e1761f28b94-kserve-provision-location\") pod \"bbf23fac-035d-482a-bbbd-0e1761f28b94\" (UID: \"bbf23fac-035d-482a-bbbd-0e1761f28b94\") " Apr 21 15:52:42.495389 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.495365 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf23fac-035d-482a-bbbd-0e1761f28b94-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bbf23fac-035d-482a-bbbd-0e1761f28b94" (UID: "bbf23fac-035d-482a-bbbd-0e1761f28b94"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:52:42.596298 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.596263 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbf23fac-035d-482a-bbbd-0e1761f28b94-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:52:42.776239 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.776203 2576 generic.go:358] "Generic (PLEG): container finished" podID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerID="ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c" exitCode=0 Apr 21 15:52:42.776436 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.776280 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" Apr 21 15:52:42.776436 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.776282 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" event={"ID":"bbf23fac-035d-482a-bbbd-0e1761f28b94","Type":"ContainerDied","Data":"ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c"} Apr 21 15:52:42.776436 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.776390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8" event={"ID":"bbf23fac-035d-482a-bbbd-0e1761f28b94","Type":"ContainerDied","Data":"479b290fb85f4e479ef8a451ad020101d2ef0d6caac840ccdcc25f45ae592e04"} Apr 21 15:52:42.776436 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.776408 2576 scope.go:117] "RemoveContainer" containerID="ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c" Apr 21 15:52:42.785383 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.785363 2576 scope.go:117] "RemoveContainer" 
containerID="846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28" Apr 21 15:52:42.793214 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.793194 2576 scope.go:117] "RemoveContainer" containerID="ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c" Apr 21 15:52:42.793481 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:52:42.793461 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c\": container with ID starting with ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c not found: ID does not exist" containerID="ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c" Apr 21 15:52:42.793557 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.793511 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c"} err="failed to get container status \"ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c\": rpc error: code = NotFound desc = could not find container \"ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c\": container with ID starting with ef33820cbb53c64ec99e22d69e754b0a56a87525233df315eae92c1732e36d7c not found: ID does not exist" Apr 21 15:52:42.793557 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.793530 2576 scope.go:117] "RemoveContainer" containerID="846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28" Apr 21 15:52:42.793763 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:52:42.793745 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28\": container with ID starting with 846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28 not found: ID does not exist" 
containerID="846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28" Apr 21 15:52:42.793809 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.793770 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28"} err="failed to get container status \"846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28\": rpc error: code = NotFound desc = could not find container \"846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28\": container with ID starting with 846fd3d74ffd1aedfd20b1aabe68996ac71c8605a6bedecae6bc010fe8490c28 not found: ID does not exist" Apr 21 15:52:42.811313 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.811284 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8"] Apr 21 15:52:42.817499 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:42.817459 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-1f208-predictor-55ff5c688f-d24z8"] Apr 21 15:52:43.802172 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:43.802136 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" path="/var/lib/kubelet/pods/bbf23fac-035d-482a-bbbd-0e1761f28b94/volumes" Apr 21 15:52:49.766503 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:49.766457 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 15:52:59.766242 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:52:59.766141 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" 
podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 15:53:09.766570 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:09.766520 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 15:53:19.766994 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:19.766944 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 15:53:29.766451 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:29.766401 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 15:53:39.766685 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:39.766632 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 15:53:49.767408 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:49.767377 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" Apr 21 15:53:52.981625 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:52.981587 2576 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4"] Apr 21 15:53:52.982028 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:52.982013 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="storage-initializer" Apr 21 15:53:52.982076 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:52.982029 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="storage-initializer" Apr 21 15:53:52.982076 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:52.982047 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" Apr 21 15:53:52.982076 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:52.982054 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" Apr 21 15:53:52.982166 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:52.982110 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbf23fac-035d-482a-bbbd-0e1761f28b94" containerName="kserve-container" Apr 21 15:53:52.985309 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:52.985292 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" Apr 21 15:53:52.988217 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:52.988190 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 21 15:53:52.988217 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:52.988205 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-daad18\"" Apr 21 15:53:52.988217 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:52.988193 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-daad18-dockercfg-qc4lm\"" Apr 21 15:53:52.997527 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:52.997477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4"] Apr 21 15:53:53.097868 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:53.097828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f334e01d-09cc-425c-8ef3-0d3cbff4f175-cabundle-cert\") pod \"isvc-secondary-daad18-predictor-58999cdf79-vzkg4\" (UID: \"f334e01d-09cc-425c-8ef3-0d3cbff4f175\") " pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" Apr 21 15:53:53.097868 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:53.097867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f334e01d-09cc-425c-8ef3-0d3cbff4f175-kserve-provision-location\") pod \"isvc-secondary-daad18-predictor-58999cdf79-vzkg4\" (UID: \"f334e01d-09cc-425c-8ef3-0d3cbff4f175\") " pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" Apr 21 15:53:53.198484 ip-10-0-136-162 kubenswrapper[2576]: 
I0421 15:53:53.198441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f334e01d-09cc-425c-8ef3-0d3cbff4f175-cabundle-cert\") pod \"isvc-secondary-daad18-predictor-58999cdf79-vzkg4\" (UID: \"f334e01d-09cc-425c-8ef3-0d3cbff4f175\") " pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" Apr 21 15:53:53.198484 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:53.198485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f334e01d-09cc-425c-8ef3-0d3cbff4f175-kserve-provision-location\") pod \"isvc-secondary-daad18-predictor-58999cdf79-vzkg4\" (UID: \"f334e01d-09cc-425c-8ef3-0d3cbff4f175\") " pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" Apr 21 15:53:53.198898 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:53.198880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f334e01d-09cc-425c-8ef3-0d3cbff4f175-kserve-provision-location\") pod \"isvc-secondary-daad18-predictor-58999cdf79-vzkg4\" (UID: \"f334e01d-09cc-425c-8ef3-0d3cbff4f175\") " pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" Apr 21 15:53:53.199111 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:53.199089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f334e01d-09cc-425c-8ef3-0d3cbff4f175-cabundle-cert\") pod \"isvc-secondary-daad18-predictor-58999cdf79-vzkg4\" (UID: \"f334e01d-09cc-425c-8ef3-0d3cbff4f175\") " pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" Apr 21 15:53:53.297036 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:53.296919 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" Apr 21 15:53:53.451083 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:53.451045 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4"] Apr 21 15:53:53.454064 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:53:53.454031 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf334e01d_09cc_425c_8ef3_0d3cbff4f175.slice/crio-02a4f5d3e2254e9e5046e15408ec2670560e4938ee4ea1dd98c62b1293c4cb08 WatchSource:0}: Error finding container 02a4f5d3e2254e9e5046e15408ec2670560e4938ee4ea1dd98c62b1293c4cb08: Status 404 returned error can't find the container with id 02a4f5d3e2254e9e5046e15408ec2670560e4938ee4ea1dd98c62b1293c4cb08 Apr 21 15:53:54.022914 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:54.022876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" event={"ID":"f334e01d-09cc-425c-8ef3-0d3cbff4f175","Type":"ContainerStarted","Data":"1d19fc9f5db616e1e01b140eaf9657a506e9aab8476701bdb648a76151a41c05"} Apr 21 15:53:54.022914 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:54.022915 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" event={"ID":"f334e01d-09cc-425c-8ef3-0d3cbff4f175","Type":"ContainerStarted","Data":"02a4f5d3e2254e9e5046e15408ec2670560e4938ee4ea1dd98c62b1293c4cb08"} Apr 21 15:53:59.041462 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:59.041431 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-daad18-predictor-58999cdf79-vzkg4_f334e01d-09cc-425c-8ef3-0d3cbff4f175/storage-initializer/0.log" Apr 21 15:53:59.041866 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:59.041469 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="f334e01d-09cc-425c-8ef3-0d3cbff4f175" containerID="1d19fc9f5db616e1e01b140eaf9657a506e9aab8476701bdb648a76151a41c05" exitCode=1 Apr 21 15:53:59.041866 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:53:59.041554 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" event={"ID":"f334e01d-09cc-425c-8ef3-0d3cbff4f175","Type":"ContainerDied","Data":"1d19fc9f5db616e1e01b140eaf9657a506e9aab8476701bdb648a76151a41c05"} Apr 21 15:54:00.046962 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:00.046931 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-daad18-predictor-58999cdf79-vzkg4_f334e01d-09cc-425c-8ef3-0d3cbff4f175/storage-initializer/0.log" Apr 21 15:54:00.047367 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:00.047011 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" event={"ID":"f334e01d-09cc-425c-8ef3-0d3cbff4f175","Type":"ContainerStarted","Data":"e822f602ac3e2e3a502d08b89df28c0cdb537c730c54f209a0a92929da38e7ad"} Apr 21 15:54:02.056709 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:02.056680 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-daad18-predictor-58999cdf79-vzkg4_f334e01d-09cc-425c-8ef3-0d3cbff4f175/storage-initializer/1.log" Apr 21 15:54:02.057112 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:02.057022 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-daad18-predictor-58999cdf79-vzkg4_f334e01d-09cc-425c-8ef3-0d3cbff4f175/storage-initializer/0.log" Apr 21 15:54:02.057112 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:02.057059 2576 generic.go:358] "Generic (PLEG): container finished" podID="f334e01d-09cc-425c-8ef3-0d3cbff4f175" containerID="e822f602ac3e2e3a502d08b89df28c0cdb537c730c54f209a0a92929da38e7ad" exitCode=1 Apr 21 
15:54:02.057184 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:02.057117 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" event={"ID":"f334e01d-09cc-425c-8ef3-0d3cbff4f175","Type":"ContainerDied","Data":"e822f602ac3e2e3a502d08b89df28c0cdb537c730c54f209a0a92929da38e7ad"} Apr 21 15:54:02.057184 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:02.057158 2576 scope.go:117] "RemoveContainer" containerID="1d19fc9f5db616e1e01b140eaf9657a506e9aab8476701bdb648a76151a41c05" Apr 21 15:54:02.057541 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:02.057521 2576 scope.go:117] "RemoveContainer" containerID="1d19fc9f5db616e1e01b140eaf9657a506e9aab8476701bdb648a76151a41c05" Apr 21 15:54:02.068339 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:54:02.068308 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-daad18-predictor-58999cdf79-vzkg4_kserve-ci-e2e-test_f334e01d-09cc-425c-8ef3-0d3cbff4f175_0 in pod sandbox 02a4f5d3e2254e9e5046e15408ec2670560e4938ee4ea1dd98c62b1293c4cb08 from index: no such id: '1d19fc9f5db616e1e01b140eaf9657a506e9aab8476701bdb648a76151a41c05'" containerID="1d19fc9f5db616e1e01b140eaf9657a506e9aab8476701bdb648a76151a41c05" Apr 21 15:54:02.068403 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:54:02.068365 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-daad18-predictor-58999cdf79-vzkg4_kserve-ci-e2e-test_f334e01d-09cc-425c-8ef3-0d3cbff4f175_0 in pod sandbox 02a4f5d3e2254e9e5046e15408ec2670560e4938ee4ea1dd98c62b1293c4cb08 from index: no such id: '1d19fc9f5db616e1e01b140eaf9657a506e9aab8476701bdb648a76151a41c05'; Skipping pod 
\"isvc-secondary-daad18-predictor-58999cdf79-vzkg4_kserve-ci-e2e-test(f334e01d-09cc-425c-8ef3-0d3cbff4f175)\"" logger="UnhandledError" Apr 21 15:54:02.069728 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:54:02.069708 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-daad18-predictor-58999cdf79-vzkg4_kserve-ci-e2e-test(f334e01d-09cc-425c-8ef3-0d3cbff4f175)\"" pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" podUID="f334e01d-09cc-425c-8ef3-0d3cbff4f175" Apr 21 15:54:03.062660 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:03.062632 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-daad18-predictor-58999cdf79-vzkg4_f334e01d-09cc-425c-8ef3-0d3cbff4f175/storage-initializer/1.log" Apr 21 15:54:11.072472 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.072426 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv"] Apr 21 15:54:11.073123 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.072816 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container" containerID="cri-o://d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61" gracePeriod=30 Apr 21 15:54:11.170425 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.170388 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4"] Apr 21 15:54:11.232104 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.232075 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"] Apr 21 15:54:11.237122 
ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.237101 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29" Apr 21 15:54:11.241032 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.241014 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-e57b3c\"" Apr 21 15:54:11.241289 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.241272 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-e57b3c-dockercfg-6t8kk\"" Apr 21 15:54:11.250789 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.250722 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"] Apr 21 15:54:11.309182 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.309159 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-daad18-predictor-58999cdf79-vzkg4_f334e01d-09cc-425c-8ef3-0d3cbff4f175/storage-initializer/1.log" Apr 21 15:54:11.309334 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.309226 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4"
Apr 21 15:54:11.356957 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.356860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48d460f5-eb59-4977-a544-3c3a8107b8d0-kserve-provision-location\") pod \"isvc-init-fail-e57b3c-predictor-845664f98f-bsl29\" (UID: \"48d460f5-eb59-4977-a544-3c3a8107b8d0\") " pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"
Apr 21 15:54:11.356957 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.356933 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/48d460f5-eb59-4977-a544-3c3a8107b8d0-cabundle-cert\") pod \"isvc-init-fail-e57b3c-predictor-845664f98f-bsl29\" (UID: \"48d460f5-eb59-4977-a544-3c3a8107b8d0\") " pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"
Apr 21 15:54:11.457778 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.457736 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f334e01d-09cc-425c-8ef3-0d3cbff4f175-cabundle-cert\") pod \"f334e01d-09cc-425c-8ef3-0d3cbff4f175\" (UID: \"f334e01d-09cc-425c-8ef3-0d3cbff4f175\") "
Apr 21 15:54:11.457994 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.457850 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f334e01d-09cc-425c-8ef3-0d3cbff4f175-kserve-provision-location\") pod \"f334e01d-09cc-425c-8ef3-0d3cbff4f175\" (UID: \"f334e01d-09cc-425c-8ef3-0d3cbff4f175\") "
Apr 21 15:54:11.457994 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.457957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/48d460f5-eb59-4977-a544-3c3a8107b8d0-cabundle-cert\") pod \"isvc-init-fail-e57b3c-predictor-845664f98f-bsl29\" (UID: \"48d460f5-eb59-4977-a544-3c3a8107b8d0\") " pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"
Apr 21 15:54:11.458142 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.458007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48d460f5-eb59-4977-a544-3c3a8107b8d0-kserve-provision-location\") pod \"isvc-init-fail-e57b3c-predictor-845664f98f-bsl29\" (UID: \"48d460f5-eb59-4977-a544-3c3a8107b8d0\") " pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"
Apr 21 15:54:11.458208 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.458130 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f334e01d-09cc-425c-8ef3-0d3cbff4f175-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "f334e01d-09cc-425c-8ef3-0d3cbff4f175" (UID: "f334e01d-09cc-425c-8ef3-0d3cbff4f175"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:54:11.458208 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.458157 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f334e01d-09cc-425c-8ef3-0d3cbff4f175-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f334e01d-09cc-425c-8ef3-0d3cbff4f175" (UID: "f334e01d-09cc-425c-8ef3-0d3cbff4f175"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:54:11.458367 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.458318 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48d460f5-eb59-4977-a544-3c3a8107b8d0-kserve-provision-location\") pod \"isvc-init-fail-e57b3c-predictor-845664f98f-bsl29\" (UID: \"48d460f5-eb59-4977-a544-3c3a8107b8d0\") " pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"
Apr 21 15:54:11.458613 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.458594 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/48d460f5-eb59-4977-a544-3c3a8107b8d0-cabundle-cert\") pod \"isvc-init-fail-e57b3c-predictor-845664f98f-bsl29\" (UID: \"48d460f5-eb59-4977-a544-3c3a8107b8d0\") " pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"
Apr 21 15:54:11.553284 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.553239 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"
Apr 21 15:54:11.559520 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.559461 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f334e01d-09cc-425c-8ef3-0d3cbff4f175-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:54:11.559656 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.559529 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f334e01d-09cc-425c-8ef3-0d3cbff4f175-cabundle-cert\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:54:11.685918 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.685812 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"]
Apr 21 15:54:11.688915 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:54:11.688878 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48d460f5_eb59_4977_a544_3c3a8107b8d0.slice/crio-4de542d6f71ad5bf18185f84dd5985b71ed0b3402a6b9c1b462d2dba13db4359 WatchSource:0}: Error finding container 4de542d6f71ad5bf18185f84dd5985b71ed0b3402a6b9c1b462d2dba13db4359: Status 404 returned error can't find the container with id 4de542d6f71ad5bf18185f84dd5985b71ed0b3402a6b9c1b462d2dba13db4359
Apr 21 15:54:11.695156 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:11.695132 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-e57b3c\""
Apr 21 15:54:12.093552 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:12.093514 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4" event={"ID":"f334e01d-09cc-425c-8ef3-0d3cbff4f175","Type":"ContainerDied","Data":"02a4f5d3e2254e9e5046e15408ec2670560e4938ee4ea1dd98c62b1293c4cb08"}
Apr 21 15:54:12.093552 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:12.093544 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4"
Apr 21 15:54:12.094119 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:12.093565 2576 scope.go:117] "RemoveContainer" containerID="e822f602ac3e2e3a502d08b89df28c0cdb537c730c54f209a0a92929da38e7ad"
Apr 21 15:54:12.095419 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:12.095392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29" event={"ID":"48d460f5-eb59-4977-a544-3c3a8107b8d0","Type":"ContainerStarted","Data":"5f5bd63b177b6e0e8e3c689923a51d838ae62507e3fe94f13d394b477357f0f4"}
Apr 21 15:54:12.095576 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:12.095425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29" event={"ID":"48d460f5-eb59-4977-a544-3c3a8107b8d0","Type":"ContainerStarted","Data":"4de542d6f71ad5bf18185f84dd5985b71ed0b3402a6b9c1b462d2dba13db4359"}
Apr 21 15:54:12.142827 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:12.142791 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4"]
Apr 21 15:54:12.151381 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:12.151346 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-daad18-predictor-58999cdf79-vzkg4"]
Apr 21 15:54:13.802329 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:13.802293 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f334e01d-09cc-425c-8ef3-0d3cbff4f175" path="/var/lib/kubelet/pods/f334e01d-09cc-425c-8ef3-0d3cbff4f175/volumes"
Apr 21 15:54:16.024541 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.024515 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv"
Apr 21 15:54:16.112749 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.112712 2576 generic.go:358] "Generic (PLEG): container finished" podID="34654724-da9c-44af-a61c-ed4883f3999d" containerID="d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61" exitCode=0
Apr 21 15:54:16.112936 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.112782 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv"
Apr 21 15:54:16.112936 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.112792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" event={"ID":"34654724-da9c-44af-a61c-ed4883f3999d","Type":"ContainerDied","Data":"d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61"}
Apr 21 15:54:16.112936 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.112827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv" event={"ID":"34654724-da9c-44af-a61c-ed4883f3999d","Type":"ContainerDied","Data":"6d09914bf863b35ff4b2ce72180f9cdf104eb05f4c7c3c7b0de304e369534658"}
Apr 21 15:54:16.112936 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.112844 2576 scope.go:117] "RemoveContainer" containerID="d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61"
Apr 21 15:54:16.120880 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.120863 2576 scope.go:117] "RemoveContainer" containerID="454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c"
Apr 21 15:54:16.128159 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.128144 2576 scope.go:117] "RemoveContainer" containerID="d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61"
Apr 21 15:54:16.128400 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:54:16.128382 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61\": container with ID starting with d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61 not found: ID does not exist" containerID="d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61"
Apr 21 15:54:16.128452 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.128409 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61"} err="failed to get container status \"d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61\": rpc error: code = NotFound desc = could not find container \"d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61\": container with ID starting with d4f301806089f86f2cb1490096f537db228de7b22792a97d79a84d899eb1da61 not found: ID does not exist"
Apr 21 15:54:16.128452 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.128428 2576 scope.go:117] "RemoveContainer" containerID="454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c"
Apr 21 15:54:16.128661 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:54:16.128644 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c\": container with ID starting with 454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c not found: ID does not exist" containerID="454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c"
Apr 21 15:54:16.128705 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.128668 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c"} err="failed to get container status \"454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c\": rpc error: code = NotFound desc = could not find container \"454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c\": container with ID starting with 454d597f45660a57e521c1c3cef1f5f953b5f2fb74e753660f7733c0b884387c not found: ID does not exist"
Apr 21 15:54:16.199223 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.199186 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34654724-da9c-44af-a61c-ed4883f3999d-kserve-provision-location\") pod \"34654724-da9c-44af-a61c-ed4883f3999d\" (UID: \"34654724-da9c-44af-a61c-ed4883f3999d\") "
Apr 21 15:54:16.199518 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.199479 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34654724-da9c-44af-a61c-ed4883f3999d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34654724-da9c-44af-a61c-ed4883f3999d" (UID: "34654724-da9c-44af-a61c-ed4883f3999d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:54:16.300424 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.300390 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34654724-da9c-44af-a61c-ed4883f3999d-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:54:16.439684 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.439642 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv"]
Apr 21 15:54:16.443948 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:16.443917 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-daad18-predictor-7d747cb478-h4pvv"]
Apr 21 15:54:17.117980 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:17.117954 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e57b3c-predictor-845664f98f-bsl29_48d460f5-eb59-4977-a544-3c3a8107b8d0/storage-initializer/0.log"
Apr 21 15:54:17.118456 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:17.117989 2576 generic.go:358] "Generic (PLEG): container finished" podID="48d460f5-eb59-4977-a544-3c3a8107b8d0" containerID="5f5bd63b177b6e0e8e3c689923a51d838ae62507e3fe94f13d394b477357f0f4" exitCode=1
Apr 21 15:54:17.118456 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:17.118077 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29" event={"ID":"48d460f5-eb59-4977-a544-3c3a8107b8d0","Type":"ContainerDied","Data":"5f5bd63b177b6e0e8e3c689923a51d838ae62507e3fe94f13d394b477357f0f4"}
Apr 21 15:54:17.803997 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:17.803952 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34654724-da9c-44af-a61c-ed4883f3999d" path="/var/lib/kubelet/pods/34654724-da9c-44af-a61c-ed4883f3999d/volumes"
Apr 21 15:54:18.123999 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:18.123922 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e57b3c-predictor-845664f98f-bsl29_48d460f5-eb59-4977-a544-3c3a8107b8d0/storage-initializer/0.log"
Apr 21 15:54:18.124357 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:18.124029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29" event={"ID":"48d460f5-eb59-4977-a544-3c3a8107b8d0","Type":"ContainerStarted","Data":"a870c26dd2a56c6d79b302e445f5b9457cddd4e06f5a95063f1803ee2212dd14"}
Apr 21 15:54:21.139041 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.139012 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e57b3c-predictor-845664f98f-bsl29_48d460f5-eb59-4977-a544-3c3a8107b8d0/storage-initializer/1.log"
Apr 21 15:54:21.139480 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.139387 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e57b3c-predictor-845664f98f-bsl29_48d460f5-eb59-4977-a544-3c3a8107b8d0/storage-initializer/0.log"
Apr 21 15:54:21.139480 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.139424 2576 generic.go:358] "Generic (PLEG): container finished" podID="48d460f5-eb59-4977-a544-3c3a8107b8d0" containerID="a870c26dd2a56c6d79b302e445f5b9457cddd4e06f5a95063f1803ee2212dd14" exitCode=1
Apr 21 15:54:21.139586 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.139520 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29" event={"ID":"48d460f5-eb59-4977-a544-3c3a8107b8d0","Type":"ContainerDied","Data":"a870c26dd2a56c6d79b302e445f5b9457cddd4e06f5a95063f1803ee2212dd14"}
Apr 21 15:54:21.139586 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.139569 2576 scope.go:117] "RemoveContainer" containerID="5f5bd63b177b6e0e8e3c689923a51d838ae62507e3fe94f13d394b477357f0f4"
Apr 21 15:54:21.139950 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.139934 2576 scope.go:117] "RemoveContainer" containerID="5f5bd63b177b6e0e8e3c689923a51d838ae62507e3fe94f13d394b477357f0f4"
Apr 21 15:54:21.150713 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:54:21.150687 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-e57b3c-predictor-845664f98f-bsl29_kserve-ci-e2e-test_48d460f5-eb59-4977-a544-3c3a8107b8d0_0 in pod sandbox 4de542d6f71ad5bf18185f84dd5985b71ed0b3402a6b9c1b462d2dba13db4359 from index: no such id: '5f5bd63b177b6e0e8e3c689923a51d838ae62507e3fe94f13d394b477357f0f4'" containerID="5f5bd63b177b6e0e8e3c689923a51d838ae62507e3fe94f13d394b477357f0f4"
Apr 21 15:54:21.150776 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:54:21.150730 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-e57b3c-predictor-845664f98f-bsl29_kserve-ci-e2e-test_48d460f5-eb59-4977-a544-3c3a8107b8d0_0 in pod sandbox 4de542d6f71ad5bf18185f84dd5985b71ed0b3402a6b9c1b462d2dba13db4359 from index: no such id: '5f5bd63b177b6e0e8e3c689923a51d838ae62507e3fe94f13d394b477357f0f4'; Skipping pod \"isvc-init-fail-e57b3c-predictor-845664f98f-bsl29_kserve-ci-e2e-test(48d460f5-eb59-4977-a544-3c3a8107b8d0)\"" logger="UnhandledError"
Apr 21 15:54:21.152087 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:54:21.152066 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-e57b3c-predictor-845664f98f-bsl29_kserve-ci-e2e-test(48d460f5-eb59-4977-a544-3c3a8107b8d0)\"" pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29" podUID="48d460f5-eb59-4977-a544-3c3a8107b8d0"
Apr 21 15:54:21.227998 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.227961 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"]
Apr 21 15:54:21.368850 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.368818 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"]
Apr 21 15:54:21.369245 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.369232 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f334e01d-09cc-425c-8ef3-0d3cbff4f175" containerName="storage-initializer"
Apr 21 15:54:21.369381 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.369247 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f334e01d-09cc-425c-8ef3-0d3cbff4f175" containerName="storage-initializer"
Apr 21 15:54:21.369381 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.369256 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container"
Apr 21 15:54:21.369381 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.369262 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container"
Apr 21 15:54:21.369381 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.369275 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f334e01d-09cc-425c-8ef3-0d3cbff4f175" containerName="storage-initializer"
Apr 21 15:54:21.369381 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.369281 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f334e01d-09cc-425c-8ef3-0d3cbff4f175" containerName="storage-initializer"
Apr 21 15:54:21.369381 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.369289 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="storage-initializer"
Apr 21 15:54:21.369381 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.369295 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="storage-initializer"
Apr 21 15:54:21.369381 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.369371 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f334e01d-09cc-425c-8ef3-0d3cbff4f175" containerName="storage-initializer"
Apr 21 15:54:21.369381 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.369380 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="34654724-da9c-44af-a61c-ed4883f3999d" containerName="kserve-container"
Apr 21 15:54:21.369757 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.369503 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f334e01d-09cc-425c-8ef3-0d3cbff4f175" containerName="storage-initializer"
Apr 21 15:54:21.373610 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.373588 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"
Apr 21 15:54:21.376438 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.376414 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xqtsr\""
Apr 21 15:54:21.379506 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.379470 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"]
Apr 21 15:54:21.554630 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.554598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f822518-dd23-4cc3-b26b-556414043e59-kserve-provision-location\") pod \"raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr\" (UID: \"8f822518-dd23-4cc3-b26b-556414043e59\") " pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"
Apr 21 15:54:21.655879 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.655821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f822518-dd23-4cc3-b26b-556414043e59-kserve-provision-location\") pod \"raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr\" (UID: \"8f822518-dd23-4cc3-b26b-556414043e59\") " pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"
Apr 21 15:54:21.656212 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.656192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f822518-dd23-4cc3-b26b-556414043e59-kserve-provision-location\") pod \"raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr\" (UID: \"8f822518-dd23-4cc3-b26b-556414043e59\") " pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"
Apr 21 15:54:21.685043 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.685020 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"
Apr 21 15:54:21.813039 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:21.812903 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"]
Apr 21 15:54:21.815799 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:54:21.815765 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f822518_dd23_4cc3_b26b_556414043e59.slice/crio-b838f0a57929c74c01b044294e47845179c7d69be89bd1e1c8c6ee7302d884c7 WatchSource:0}: Error finding container b838f0a57929c74c01b044294e47845179c7d69be89bd1e1c8c6ee7302d884c7: Status 404 returned error can't find the container with id b838f0a57929c74c01b044294e47845179c7d69be89bd1e1c8c6ee7302d884c7
Apr 21 15:54:22.144924 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:22.144838 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e57b3c-predictor-845664f98f-bsl29_48d460f5-eb59-4977-a544-3c3a8107b8d0/storage-initializer/1.log"
Apr 21 15:54:22.146617 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:22.146590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" event={"ID":"8f822518-dd23-4cc3-b26b-556414043e59","Type":"ContainerStarted","Data":"5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7"}
Apr 21 15:54:22.146740 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:22.146626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" event={"ID":"8f822518-dd23-4cc3-b26b-556414043e59","Type":"ContainerStarted","Data":"b838f0a57929c74c01b044294e47845179c7d69be89bd1e1c8c6ee7302d884c7"}
Apr 21 15:54:22.272265 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:22.272242 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e57b3c-predictor-845664f98f-bsl29_48d460f5-eb59-4977-a544-3c3a8107b8d0/storage-initializer/1.log"
Apr 21 15:54:22.272385 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:22.272307 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"
Apr 21 15:54:22.361615 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:22.361531 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/48d460f5-eb59-4977-a544-3c3a8107b8d0-cabundle-cert\") pod \"48d460f5-eb59-4977-a544-3c3a8107b8d0\" (UID: \"48d460f5-eb59-4977-a544-3c3a8107b8d0\") "
Apr 21 15:54:22.361757 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:22.361656 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48d460f5-eb59-4977-a544-3c3a8107b8d0-kserve-provision-location\") pod \"48d460f5-eb59-4977-a544-3c3a8107b8d0\" (UID: \"48d460f5-eb59-4977-a544-3c3a8107b8d0\") "
Apr 21 15:54:22.361950 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:22.361916 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48d460f5-eb59-4977-a544-3c3a8107b8d0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "48d460f5-eb59-4977-a544-3c3a8107b8d0" (UID: "48d460f5-eb59-4977-a544-3c3a8107b8d0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:54:22.362053 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:22.361918 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48d460f5-eb59-4977-a544-3c3a8107b8d0-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "48d460f5-eb59-4977-a544-3c3a8107b8d0" (UID: "48d460f5-eb59-4977-a544-3c3a8107b8d0"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:54:22.462329 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:22.462243 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/48d460f5-eb59-4977-a544-3c3a8107b8d0-cabundle-cert\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:54:22.462329 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:22.462274 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48d460f5-eb59-4977-a544-3c3a8107b8d0-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\""
Apr 21 15:54:23.152029 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:23.151998 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e57b3c-predictor-845664f98f-bsl29_48d460f5-eb59-4977-a544-3c3a8107b8d0/storage-initializer/1.log"
Apr 21 15:54:23.152450 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:23.152133 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29" event={"ID":"48d460f5-eb59-4977-a544-3c3a8107b8d0","Type":"ContainerDied","Data":"4de542d6f71ad5bf18185f84dd5985b71ed0b3402a6b9c1b462d2dba13db4359"}
Apr 21 15:54:23.152450 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:23.152181 2576 scope.go:117] "RemoveContainer" containerID="a870c26dd2a56c6d79b302e445f5b9457cddd4e06f5a95063f1803ee2212dd14"
Apr 21 15:54:23.152450 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:23.152145 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"
Apr 21 15:54:23.198384 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:23.198344 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"]
Apr 21 15:54:23.203599 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:23.203569 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e57b3c-predictor-845664f98f-bsl29"]
Apr 21 15:54:23.802248 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:23.802211 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48d460f5-eb59-4977-a544-3c3a8107b8d0" path="/var/lib/kubelet/pods/48d460f5-eb59-4977-a544-3c3a8107b8d0/volumes"
Apr 21 15:54:26.164065 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:26.163971 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f822518-dd23-4cc3-b26b-556414043e59" containerID="5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7" exitCode=0
Apr 21 15:54:26.164065 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:26.164048 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" event={"ID":"8f822518-dd23-4cc3-b26b-556414043e59","Type":"ContainerDied","Data":"5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7"}
Apr 21 15:54:27.169383 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:27.169348 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" event={"ID":"8f822518-dd23-4cc3-b26b-556414043e59","Type":"ContainerStarted","Data":"56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081"}
Apr 21 15:54:27.169890 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:27.169642 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"
Apr 21 15:54:27.171084 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:27.171056 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 21 15:54:27.189248 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:27.189199 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" podStartSLOduration=6.189185539 podStartE2EDuration="6.189185539s" podCreationTimestamp="2026-04-21 15:54:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:54:27.187993155 +0000 UTC m=+1155.906070516" watchObservedRunningTime="2026-04-21 15:54:27.189185539 +0000 UTC m=+1155.907262887"
Apr 21 15:54:28.173786 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:28.173750 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 21 15:54:38.173950 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:38.173895 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 21 15:54:48.174712 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:48.174666 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 21 15:54:58.174107 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:54:58.174062 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 21 15:55:08.174170 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:08.174113 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 21 15:55:11.807942 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:11.807895 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log"
Apr 21 15:55:11.811685 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:11.811661 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log"
Apr 21 15:55:18.174374 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:18.174320 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 21 15:55:28.174554 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:28.174477 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 21 15:55:36.799338 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:36.799288 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"
Apr 21 15:55:41.626206 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.626168 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8"]
Apr 21 15:55:41.626638 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.626561 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48d460f5-eb59-4977-a544-3c3a8107b8d0" containerName="storage-initializer"
Apr 21 15:55:41.626638 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.626573 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="48d460f5-eb59-4977-a544-3c3a8107b8d0" containerName="storage-initializer"
Apr 21 15:55:41.626714 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.626646 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="48d460f5-eb59-4977-a544-3c3a8107b8d0" containerName="storage-initializer"
Apr 21 15:55:41.626714 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.626701 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48d460f5-eb59-4977-a544-3c3a8107b8d0" containerName="storage-initializer"
Apr 21 15:55:41.626714 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.626706 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="48d460f5-eb59-4977-a544-3c3a8107b8d0" containerName="storage-initializer"
Apr 21 15:55:41.626810 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.626797 2576 memory_manager.go:356] "RemoveStaleState
removing state" podUID="48d460f5-eb59-4977-a544-3c3a8107b8d0" containerName="storage-initializer" Apr 21 15:55:41.629920 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.629900 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" Apr 21 15:55:41.637519 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.637472 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8"] Apr 21 15:55:41.676445 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.676409 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"] Apr 21 15:55:41.676691 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.676668 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container" containerID="cri-o://56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081" gracePeriod=30 Apr 21 15:55:41.726669 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.726627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a315b444-99ef-4eae-b5b8-9ed2e0a95c56-kserve-provision-location\") pod \"raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8\" (UID: \"a315b444-99ef-4eae-b5b8-9ed2e0a95c56\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" Apr 21 15:55:41.827301 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.827265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a315b444-99ef-4eae-b5b8-9ed2e0a95c56-kserve-provision-location\") pod 
\"raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8\" (UID: \"a315b444-99ef-4eae-b5b8-9ed2e0a95c56\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" Apr 21 15:55:41.827698 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.827676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a315b444-99ef-4eae-b5b8-9ed2e0a95c56-kserve-provision-location\") pod \"raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8\" (UID: \"a315b444-99ef-4eae-b5b8-9ed2e0a95c56\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" Apr 21 15:55:41.940734 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:41.940644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" Apr 21 15:55:42.071788 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:42.071761 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8"] Apr 21 15:55:42.074391 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:55:42.074364 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda315b444_99ef_4eae_b5b8_9ed2e0a95c56.slice/crio-f0911823352fe26fbd94546151eace92e37b11d6f2689ee6e65872efc8e89f64 WatchSource:0}: Error finding container f0911823352fe26fbd94546151eace92e37b11d6f2689ee6e65872efc8e89f64: Status 404 returned error can't find the container with id f0911823352fe26fbd94546151eace92e37b11d6f2689ee6e65872efc8e89f64 Apr 21 15:55:42.435261 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:42.435217 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" 
event={"ID":"a315b444-99ef-4eae-b5b8-9ed2e0a95c56","Type":"ContainerStarted","Data":"d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf"} Apr 21 15:55:42.435261 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:42.435265 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" event={"ID":"a315b444-99ef-4eae-b5b8-9ed2e0a95c56","Type":"ContainerStarted","Data":"f0911823352fe26fbd94546151eace92e37b11d6f2689ee6e65872efc8e89f64"} Apr 21 15:55:46.449889 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:46.449849 2576 generic.go:358] "Generic (PLEG): container finished" podID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerID="d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf" exitCode=0 Apr 21 15:55:46.450296 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:46.449925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" event={"ID":"a315b444-99ef-4eae-b5b8-9ed2e0a95c56","Type":"ContainerDied","Data":"d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf"} Apr 21 15:55:46.637852 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:46.637830 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" Apr 21 15:55:46.669223 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:46.669134 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f822518-dd23-4cc3-b26b-556414043e59-kserve-provision-location\") pod \"8f822518-dd23-4cc3-b26b-556414043e59\" (UID: \"8f822518-dd23-4cc3-b26b-556414043e59\") " Apr 21 15:55:46.669667 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:46.669639 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f822518-dd23-4cc3-b26b-556414043e59-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8f822518-dd23-4cc3-b26b-556414043e59" (UID: "8f822518-dd23-4cc3-b26b-556414043e59"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:55:46.770709 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:46.770674 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f822518-dd23-4cc3-b26b-556414043e59-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:55:47.461224 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.461185 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f822518-dd23-4cc3-b26b-556414043e59" containerID="56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081" exitCode=0 Apr 21 15:55:47.461729 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.461267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" event={"ID":"8f822518-dd23-4cc3-b26b-556414043e59","Type":"ContainerDied","Data":"56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081"} Apr 21 15:55:47.461729 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:55:47.461272 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" Apr 21 15:55:47.461729 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.461298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr" event={"ID":"8f822518-dd23-4cc3-b26b-556414043e59","Type":"ContainerDied","Data":"b838f0a57929c74c01b044294e47845179c7d69be89bd1e1c8c6ee7302d884c7"} Apr 21 15:55:47.461729 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.461314 2576 scope.go:117] "RemoveContainer" containerID="56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081" Apr 21 15:55:47.463092 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.463061 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" event={"ID":"a315b444-99ef-4eae-b5b8-9ed2e0a95c56","Type":"ContainerStarted","Data":"a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6"} Apr 21 15:55:47.463414 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.463389 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" Apr 21 15:55:47.464983 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.464960 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 21 15:55:47.470028 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.470011 2576 scope.go:117] "RemoveContainer" containerID="5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7" Apr 21 15:55:47.478314 ip-10-0-136-162 kubenswrapper[2576]: I0421 
15:55:47.478293 2576 scope.go:117] "RemoveContainer" containerID="56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081" Apr 21 15:55:47.478673 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:55:47.478652 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081\": container with ID starting with 56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081 not found: ID does not exist" containerID="56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081" Apr 21 15:55:47.478722 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.478682 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081"} err="failed to get container status \"56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081\": rpc error: code = NotFound desc = could not find container \"56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081\": container with ID starting with 56a01772387b5364d4768a986e164a9932dab0c8718ef4ca83cb6052188e0081 not found: ID does not exist" Apr 21 15:55:47.478722 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.478704 2576 scope.go:117] "RemoveContainer" containerID="5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7" Apr 21 15:55:47.478927 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:55:47.478913 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7\": container with ID starting with 5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7 not found: ID does not exist" containerID="5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7" Apr 21 15:55:47.478968 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.478931 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7"} err="failed to get container status \"5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7\": rpc error: code = NotFound desc = could not find container \"5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7\": container with ID starting with 5ca5842ea3a1438c077c030b65f5db7b72315ac8b736608dccd2d0af7aed60c7 not found: ID does not exist" Apr 21 15:55:47.489945 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.489896 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" podStartSLOduration=6.489877132 podStartE2EDuration="6.489877132s" podCreationTimestamp="2026-04-21 15:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:55:47.488261447 +0000 UTC m=+1236.206338796" watchObservedRunningTime="2026-04-21 15:55:47.489877132 +0000 UTC m=+1236.207954470" Apr 21 15:55:47.508008 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.507971 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"] Apr 21 15:55:47.511506 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.511465 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e5dc9-predictor-b76fc85f7-5tnlr"] Apr 21 15:55:47.802642 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:47.802609 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f822518-dd23-4cc3-b26b-556414043e59" path="/var/lib/kubelet/pods/8f822518-dd23-4cc3-b26b-556414043e59/volumes" Apr 21 15:55:48.468182 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:48.468145 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 21 15:55:58.469208 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:55:58.469106 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 21 15:56:08.468468 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:56:08.468424 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 21 15:56:18.468469 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:56:18.468419 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 21 15:56:28.469030 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:56:28.468946 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 21 15:56:38.468421 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:56:38.468378 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" 
podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 21 15:56:48.469072 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:56:48.469025 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 21 15:56:57.802746 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:56:57.802716 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" Apr 21 15:57:01.885276 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:01.885237 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8"] Apr 21 15:57:01.885767 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:01.885595 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container" containerID="cri-o://a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6" gracePeriod=30 Apr 21 15:57:06.540983 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.540959 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" Apr 21 15:57:06.681824 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.681736 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a315b444-99ef-4eae-b5b8-9ed2e0a95c56-kserve-provision-location\") pod \"a315b444-99ef-4eae-b5b8-9ed2e0a95c56\" (UID: \"a315b444-99ef-4eae-b5b8-9ed2e0a95c56\") " Apr 21 15:57:06.682094 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.682071 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a315b444-99ef-4eae-b5b8-9ed2e0a95c56-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a315b444-99ef-4eae-b5b8-9ed2e0a95c56" (UID: "a315b444-99ef-4eae-b5b8-9ed2e0a95c56"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:57:06.741097 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.741057 2576 generic.go:358] "Generic (PLEG): container finished" podID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerID="a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6" exitCode=0 Apr 21 15:57:06.741305 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.741174 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" Apr 21 15:57:06.741305 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.741177 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" event={"ID":"a315b444-99ef-4eae-b5b8-9ed2e0a95c56","Type":"ContainerDied","Data":"a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6"} Apr 21 15:57:06.741305 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.741219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8" event={"ID":"a315b444-99ef-4eae-b5b8-9ed2e0a95c56","Type":"ContainerDied","Data":"f0911823352fe26fbd94546151eace92e37b11d6f2689ee6e65872efc8e89f64"} Apr 21 15:57:06.741305 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.741235 2576 scope.go:117] "RemoveContainer" containerID="a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6" Apr 21 15:57:06.750009 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.749991 2576 scope.go:117] "RemoveContainer" containerID="d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf" Apr 21 15:57:06.757621 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.757599 2576 scope.go:117] "RemoveContainer" containerID="a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6" Apr 21 15:57:06.757943 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:57:06.757925 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6\": container with ID starting with a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6 not found: ID does not exist" containerID="a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6" Apr 21 15:57:06.758005 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.757952 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6"} err="failed to get container status \"a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6\": rpc error: code = NotFound desc = could not find container \"a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6\": container with ID starting with a2c3676fbeb861ce85f23249a895df9bb0e216c7912839c1305224a97da51de6 not found: ID does not exist" Apr 21 15:57:06.758005 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.757970 2576 scope.go:117] "RemoveContainer" containerID="d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf" Apr 21 15:57:06.758187 ip-10-0-136-162 kubenswrapper[2576]: E0421 15:57:06.758173 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf\": container with ID starting with d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf not found: ID does not exist" containerID="d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf" Apr 21 15:57:06.758226 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.758192 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf"} err="failed to get container status \"d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf\": rpc error: code = NotFound desc = could not find container \"d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf\": container with ID starting with d7f5f32da9efe6a8a98963954732873e75e21b5fdb6a620544e13aef3e8f8baf not found: ID does not exist" Apr 21 15:57:06.764424 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.764391 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8"] Apr 21 15:57:06.767008 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.766984 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-06bac-predictor-6f4db74cd7-hc5h8"] Apr 21 15:57:06.783428 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:06.783394 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a315b444-99ef-4eae-b5b8-9ed2e0a95c56-kserve-provision-location\") on node \"ip-10-0-136-162.ec2.internal\" DevicePath \"\"" Apr 21 15:57:07.801746 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:07.801712 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" path="/var/lib/kubelet/pods/a315b444-99ef-4eae-b5b8-9ed2e0a95c56/volumes" Apr 21 15:57:30.532917 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:30.532825 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6f2r4_3a193bd2-d4b3-409b-a943-668e1838d610/global-pull-secret-syncer/0.log" Apr 21 15:57:30.663536 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:30.663499 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7vpp5_dde473f9-f7fd-43ae-b044-2347e8649fee/konnectivity-agent/0.log" Apr 21 15:57:30.821865 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:30.821831 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-162.ec2.internal_2ad89fa060fba0be0166e35605272b1a/haproxy/0.log" Apr 21 15:57:33.947670 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:33.947642 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-ttq57_f63638af-e18a-4727-b301-3061e1e187b2/cluster-monitoring-operator/0.log" Apr 21 15:57:33.976809 ip-10-0-136-162 
kubenswrapper[2576]: I0421 15:57:33.976780 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-blplp_e7b09aa4-22f8-4fa3-b7cb-08b261441aff/kube-state-metrics/0.log" Apr 21 15:57:34.009272 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:34.009242 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-blplp_e7b09aa4-22f8-4fa3-b7cb-08b261441aff/kube-rbac-proxy-main/0.log" Apr 21 15:57:34.056528 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:34.056473 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-blplp_e7b09aa4-22f8-4fa3-b7cb-08b261441aff/kube-rbac-proxy-self/0.log" Apr 21 15:57:34.524056 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:34.524025 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tzxlg_d9c8569a-9c0f-45ef-b5b3-6d6893c96750/node-exporter/0.log" Apr 21 15:57:34.575897 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:34.575865 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tzxlg_d9c8569a-9c0f-45ef-b5b3-6d6893c96750/kube-rbac-proxy/0.log" Apr 21 15:57:34.645960 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:34.645928 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tzxlg_d9c8569a-9c0f-45ef-b5b3-6d6893c96750/init-textfile/0.log" Apr 21 15:57:35.465650 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:35.465543 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-zzs27_7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9/prometheus-operator/0.log" Apr 21 15:57:35.553173 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:35.553146 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-zzs27_7a8e0d7d-a3b6-4a7d-91d5-b2d10859b1f9/kube-rbac-proxy/0.log"
Apr 21 15:57:35.731299 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:35.731204 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-9ntvk_a1c5802a-6473-41b2-bc16-71d347b210f5/prometheus-operator-admission-webhook/0.log"
Apr 21 15:57:36.076948 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.076911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c7748f595-45kkr_81d61fd2-e12a-4415-9e98-d8d556cb75f4/thanos-query/0.log"
Apr 21 15:57:36.129314 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.129281 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c7748f595-45kkr_81d61fd2-e12a-4415-9e98-d8d556cb75f4/kube-rbac-proxy-web/0.log"
Apr 21 15:57:36.161891 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.161858 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c7748f595-45kkr_81d61fd2-e12a-4415-9e98-d8d556cb75f4/kube-rbac-proxy/0.log"
Apr 21 15:57:36.193983 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.193955 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c7748f595-45kkr_81d61fd2-e12a-4415-9e98-d8d556cb75f4/prom-label-proxy/0.log"
Apr 21 15:57:36.237328 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.237300 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c7748f595-45kkr_81d61fd2-e12a-4415-9e98-d8d556cb75f4/kube-rbac-proxy-rules/0.log"
Apr 21 15:57:36.279330 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.279305 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c7748f595-45kkr_81d61fd2-e12a-4415-9e98-d8d556cb75f4/kube-rbac-proxy-metrics/0.log"
Apr 21 15:57:36.971678 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.971640 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"]
Apr 21 15:57:36.972053 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.972017 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container"
Apr 21 15:57:36.972053 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.972029 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container"
Apr 21 15:57:36.972053 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.972047 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="storage-initializer"
Apr 21 15:57:36.972053 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.972053 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="storage-initializer"
Apr 21 15:57:36.972195 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.972067 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container"
Apr 21 15:57:36.972195 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.972073 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container"
Apr 21 15:57:36.972195 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.972081 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="storage-initializer"
Apr 21 15:57:36.972195 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.972086 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="storage-initializer"
Apr 21 15:57:36.972195 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.972144 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a315b444-99ef-4eae-b5b8-9ed2e0a95c56" containerName="kserve-container"
Apr 21 15:57:36.972195 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.972154 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f822518-dd23-4cc3-b26b-556414043e59" containerName="kserve-container"
Apr 21 15:57:36.975275 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.975259 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:36.978105 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.978083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7pvkx\"/\"openshift-service-ca.crt\""
Apr 21 15:57:36.978229 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.978086 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7pvkx\"/\"kube-root-ca.crt\""
Apr 21 15:57:36.979274 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.979242 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7pvkx\"/\"default-dockercfg-mgs4b\""
Apr 21 15:57:36.981425 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:36.981407 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"]
Apr 21 15:57:37.051032 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.051000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rldx\" (UniqueName: \"kubernetes.io/projected/362e88ab-42e0-4dec-8585-bf03056b4d27-kube-api-access-5rldx\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.051032 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.051034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-sys\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.051273 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.051093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-podres\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.051273 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.051111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-lib-modules\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.051273 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.051177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-proc\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.152329 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.152281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-podres\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.152329 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.152334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-lib-modules\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.152608 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.152380 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-proc\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.152608 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.152420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-podres\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.152608 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.152424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rldx\" (UniqueName: \"kubernetes.io/projected/362e88ab-42e0-4dec-8585-bf03056b4d27-kube-api-access-5rldx\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.152608 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.152485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-lib-modules\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.152608 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.152517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-sys\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.152608 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.152548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-proc\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.152828 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.152632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/362e88ab-42e0-4dec-8585-bf03056b4d27-sys\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.162291 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.162261 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rldx\" (UniqueName: \"kubernetes.io/projected/362e88ab-42e0-4dec-8585-bf03056b4d27-kube-api-access-5rldx\") pod \"perf-node-gather-daemonset-krzpz\" (UID: \"362e88ab-42e0-4dec-8585-bf03056b4d27\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.198285 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.198253 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-ndqn6_b8784f5d-7f15-4691-ba7d-539cda706701/networking-console-plugin/0.log"
Apr 21 15:57:37.285727 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.285631 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.425419 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.425395 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"]
Apr 21 15:57:37.428139 ip-10-0-136-162 kubenswrapper[2576]: W0421 15:57:37.428100 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod362e88ab_42e0_4dec_8585_bf03056b4d27.slice/crio-fc1e8b720f832e305527be7a5acf57847ea0233477dfd37a80180f541dde3079 WatchSource:0}: Error finding container fc1e8b720f832e305527be7a5acf57847ea0233477dfd37a80180f541dde3079: Status 404 returned error can't find the container with id fc1e8b720f832e305527be7a5acf57847ea0233477dfd37a80180f541dde3079
Apr 21 15:57:37.429701 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.429682 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:57:37.695154 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.695122 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/1.log"
Apr 21 15:57:37.704503 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.704468 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dv5m4_e203aed0-40fa-4049-8152-8cb9d29fe09e/console-operator/2.log"
Apr 21 15:57:37.848665 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.848633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz" event={"ID":"362e88ab-42e0-4dec-8585-bf03056b4d27","Type":"ContainerStarted","Data":"9193300d05cf8cf5665e9659895454946bcbe460c1f66798cc206e75f1a98906"}
Apr 21 15:57:37.848665 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.848668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz" event={"ID":"362e88ab-42e0-4dec-8585-bf03056b4d27","Type":"ContainerStarted","Data":"fc1e8b720f832e305527be7a5acf57847ea0233477dfd37a80180f541dde3079"}
Apr 21 15:57:37.848892 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.848776 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:37.873099 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:37.873049 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz" podStartSLOduration=1.873035072 podStartE2EDuration="1.873035072s" podCreationTimestamp="2026-04-21 15:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:57:37.872557731 +0000 UTC m=+1346.590635082" watchObservedRunningTime="2026-04-21 15:57:37.873035072 +0000 UTC m=+1346.591112420"
Apr 21 15:57:38.174387 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:38.174360 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56bc4fbb44-m7gsp_1ebb80ef-7217-42d0-9955-a96caaee251b/console/0.log"
Apr 21 15:57:38.728248 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:38.728216 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-926t8_5e975175-9472-4f4d-ac64-96a287811fe5/volume-data-source-validator/0.log"
Apr 21 15:57:39.733713 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:39.733684 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9wdl5_e7e9ca3a-1238-49eb-be83-c342ccbacce4/dns/0.log"
Apr 21 15:57:39.777352 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:39.777316 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9wdl5_e7e9ca3a-1238-49eb-be83-c342ccbacce4/kube-rbac-proxy/0.log"
Apr 21 15:57:40.030469 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:40.030360 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wl2ck_e7197e20-bb57-4167-b435-1446351d6727/dns-node-resolver/0.log"
Apr 21 15:57:40.694875 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:40.694844 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-b59t6_785c95fa-5c55-4ea7-8b31-adcc8f22c2e2/node-ca/0.log"
Apr 21 15:57:42.066076 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:42.066047 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-n554q_94cdcb75-41df-488a-9f65-1dcac041f00e/serve-healthcheck-canary/0.log"
Apr 21 15:57:42.597248 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:42.597209 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-75lvm_2f94a9e6-1b72-42ce-8590-58a4cdf199ab/insights-operator/0.log"
Apr 21 15:57:42.625004 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:42.624973 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5z45k_19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed/kube-rbac-proxy/0.log"
Apr 21 15:57:42.653195 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:42.653163 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5z45k_19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed/exporter/0.log"
Apr 21 15:57:42.683319 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:42.683285 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5z45k_19bbbe8b-bd00-4795-ab06-e9b9e3cb2eed/extractor/0.log"
Apr 21 15:57:43.862397 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:43.862370 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-krzpz"
Apr 21 15:57:45.002579 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:45.002550 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-675zd_7e967817-d37c-4c44-971c-3512adb8603d/server/0.log"
Apr 21 15:57:45.136180 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:45.136143 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-zxk7h_c75281e7-f4b2-4753-91b0-031da738e963/manager/0.log"
Apr 21 15:57:45.265334 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:45.265248 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-ngqsk_70dc397b-75b1-4c2c-9810-89227545be76/seaweedfs/0.log"
Apr 21 15:57:50.432824 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:50.432769 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-fcfg7_8287ddd5-c147-400c-b1e7-382801765df6/kube-storage-version-migrator-operator/1.log"
Apr 21 15:57:50.434646 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:50.434621 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-fcfg7_8287ddd5-c147-400c-b1e7-382801765df6/kube-storage-version-migrator-operator/0.log"
Apr 21 15:57:51.450287 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:51.450253 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7d5rq_97b39e40-65e5-492e-a61c-fbdc3987eeb5/kube-multus/0.log"
Apr 21 15:57:51.799413 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:51.799387 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmb6k_eb648a31-68d5-41f6-8194-806717864579/kube-multus-additional-cni-plugins/0.log"
Apr 21 15:57:51.843724 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:51.843693 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmb6k_eb648a31-68d5-41f6-8194-806717864579/egress-router-binary-copy/0.log"
Apr 21 15:57:51.882436 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:51.882407 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmb6k_eb648a31-68d5-41f6-8194-806717864579/cni-plugins/0.log"
Apr 21 15:57:51.918162 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:51.918134 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmb6k_eb648a31-68d5-41f6-8194-806717864579/bond-cni-plugin/0.log"
Apr 21 15:57:51.958703 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:51.958668 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmb6k_eb648a31-68d5-41f6-8194-806717864579/routeoverride-cni/0.log"
Apr 21 15:57:51.993743 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:51.993714 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmb6k_eb648a31-68d5-41f6-8194-806717864579/whereabouts-cni-bincopy/0.log"
Apr 21 15:57:52.024741 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:52.024707 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmb6k_eb648a31-68d5-41f6-8194-806717864579/whereabouts-cni/0.log"
Apr 21 15:57:52.360472 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:52.360435 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sq5ln_8b870a2e-b786-497a-8ee3-57668a43f22d/network-metrics-daemon/0.log"
Apr 21 15:57:52.398054 ip-10-0-136-162 kubenswrapper[2576]: I0421 15:57:52.398011 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sq5ln_8b870a2e-b786-497a-8ee3-57668a43f22d/kube-rbac-proxy/0.log"