Apr 22 18:46:52.373596 ip-10-0-143-56 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:46:52.746580 ip-10-0-143-56 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:52.746580 ip-10-0-143-56 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:46:52.746580 ip-10-0-143-56 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:52.746580 ip-10-0-143-56 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:46:52.746580 ip-10-0-143-56 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:52.747937 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.747849    2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:46:52.750144 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750128    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:52.750144 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750145    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750149    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750153    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750156    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750159    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750162    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750166    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750170    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750172    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750175    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750178    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750182    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750186    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750188    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750191    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750194    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750197    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750200    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750203    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750206    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:52.750211 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750209    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750212    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750215    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750218    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750221    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750224    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750227    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750230    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750233    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750235    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750238    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750240    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750243    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750247    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750251    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750254    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750257    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750259    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750262    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750277    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:52.750707 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750280    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750285    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750289    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750292    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750295    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750298    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750301    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750304    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750306    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750310    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750312    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750315    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750318    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750320    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750324    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750327    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750330    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750332    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750335    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:52.751252 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750338    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750341    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750344    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750346    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750349    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750352    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750355    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750358    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750361    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750370    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750373    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750375    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750378    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750381    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750384    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750387    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750390    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750392    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750395    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750398    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:52.751741 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750401    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750405    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750408    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750411    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750413    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750416    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750807    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750813    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750816    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750819    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750822    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750824    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750827    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750830    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750833    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750835    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750838    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750841    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750844    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:52.752226 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750847    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750850    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750852    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750855    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750859    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750862    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750865    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750868    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750870    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750874    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750877    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750880    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750883    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750886    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750888    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750891    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750894    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750896    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750900    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:52.752693 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750902    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750906    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750908    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750911    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750914    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750917    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750919    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750922    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750925    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750927    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750930    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750932    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750935    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750938    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750941    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750943    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750946    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750948    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750950    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750953    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:52.753165 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750955    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750958    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750960    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750963    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750965    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750968    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750970    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750973    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750975    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750978    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750981    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750983    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750986    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750989    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750993    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750996    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.750999    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751001    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751004    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751006    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:52.753714 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751009    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751012    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751014    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751016    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751019    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751021    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751024    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751027    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751029    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751032    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751034    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751037    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751039    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751042    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751118    2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751126    2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751132    2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751137    2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751141    2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751145    2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751149    2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:46:52.754207 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751154    2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751157    2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751160    2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751164    2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751173    2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751177    2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751180    2577 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751183    2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751186    2577 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751189    2577 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751192    2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751194    2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751199    2577 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751202    2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751205    2577 flags.go:64] FLAG: --config-dir=""
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751208    2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751211    2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751215    2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751219    2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751222    2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751225    2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751228    2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751231    2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751234    2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751237    2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:46:52.754735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751240    2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751244    2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751247    2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751250    2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751253    2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751256    2577 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751259    2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751264    2577 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751285    2577 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751289    2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751294    2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751301    2577 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751305    2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751309    2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751312    2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751315    2577 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751318    2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751321    2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751324    2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751327    2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:46:52.755356
ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751330 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751333 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751336 2577 flags.go:64] FLAG: --feature-gates="" Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751340 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751343 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:46:52.755356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751346 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751350 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751353 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751356 2577 flags.go:64] FLAG: --help="false" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751359 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-143-56.ec2.internal" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751362 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751365 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751368 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751371 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 
18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751375 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751378 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751380 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751383 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751386 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751389 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751392 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751394 2577 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751397 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751401 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751405 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751408 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751411 2577 flags.go:64] FLAG: --lock-file="" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751414 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751417 2577 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 22 18:46:52.755950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751420 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751425 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751428 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751431 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751434 2577 flags.go:64] FLAG: --logging-format="text" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751437 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751441 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751444 2577 flags.go:64] FLAG: --manifest-url="" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751447 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751452 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751455 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751460 2577 flags.go:64] FLAG: --max-pods="110" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751463 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751466 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: 
I0422 18:46:52.751469 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751472 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751475 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751478 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751481 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751488 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751491 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751494 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751497 2577 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:46:52.756560 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751500 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751506 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751509 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751512 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751516 2577 flags.go:64] FLAG: --port="10250" Apr 22 18:46:52.757118 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:46:52.751519 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751522 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0bc854e68feaa2445" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751525 2577 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751528 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751531 2577 flags.go:64] FLAG: --register-node="true" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751534 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751537 2577 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751541 2577 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751544 2577 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751547 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751550 2577 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751553 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751556 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751560 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751562 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:46:52.757118 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:46:52.751566 2577 flags.go:64] FLAG: --runonce="false" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751568 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751572 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751575 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751577 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751580 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:46:52.757118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751583 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751586 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751589 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751592 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751595 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751597 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751600 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751604 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751607 2577 flags.go:64] FLAG: 
--system-cgroups="" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751610 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751616 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751619 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751622 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751626 2577 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751629 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751632 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751635 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751638 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751641 2577 flags.go:64] FLAG: --v="2" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751645 2577 flags.go:64] FLAG: --version="false" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751649 2577 flags.go:64] FLAG: --vmodule="" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751654 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.751657 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751758 
2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:52.757843 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751763 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751767 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751770 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751773 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751776 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751779 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751781 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751784 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751787 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751789 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751792 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751794 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 
18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751797 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751800 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751803 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751805 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751808 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751812 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751817 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751820 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:52.758436 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751823 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751826 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751829 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751832 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751834 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751837 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751840 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751842 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751845 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751847 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751850 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751852 2577 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751855 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751858 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751860 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751863 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751866 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751868 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751870 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751873 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:52.758973 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751876 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751879 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751881 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751883 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:52.759475 ip-10-0-143-56 
kubenswrapper[2577]: W0422 18:46:52.751886 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751888 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751890 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751893 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751895 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751899 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751904 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751906 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751909 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751912 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751914 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751916 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751919 2577 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751922 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751924 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:52.759475 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751927 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751929 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751932 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751934 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751937 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751939 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751942 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751945 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751947 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751950 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:52.759951 ip-10-0-143-56 
kubenswrapper[2577]: W0422 18:46:52.751952 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751955 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751958 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751960 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751963 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751965 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751968 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751971 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751973 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751976 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:52.759951 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751978 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751981 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751985 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751989 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751991 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.751994 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.752006 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.759187 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.759317 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759368 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759373 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759376 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759380 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759384 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759388 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:52.760461 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759393 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759396 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759398 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759401 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759405 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759407 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759410 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759413 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759416 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759418 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759421 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759424 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759426 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759429 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759431 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759434 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759437 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759439 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759442 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759444 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:52.760841 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759447 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759450 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759452 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759455 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759458 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759463 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759468 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759472 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759475 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759478 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759482 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759485 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759488 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759491 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759494 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759497 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759500 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759502 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759505 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759508 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:52.761338 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759510 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759513 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759516 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759518 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759521 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759524 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759526 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759528 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759531 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759534 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759536 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759539 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759542 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759544 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759547 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759550 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759553 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759556 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759559 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759562 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:52.761844 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759564 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759567 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759569 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759572 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759575 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759578 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759580 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759583 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759586 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759588 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759591 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759593 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759596 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759598 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759602 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759604 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759607 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759610 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759612 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:52.762355 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.759615 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.759620 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760075 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760081 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760085 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760088 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760091 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760094 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760097 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760100 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760103 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760106 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760112 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760115 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760118 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760120 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:52.762847 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760123 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760125 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760128 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760131 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760133 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760136 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760139 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760141 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760143 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760146 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760149 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760151 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760154 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760156 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760158 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760161 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760163 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760166 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760169 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760171 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:52.763243 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760174 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760177 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760179 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760182 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760185 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760187 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760190 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760193 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760195 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760198 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760201 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760203 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760206 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760208 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760211 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760213 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760216 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760218 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760221 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:52.763751 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760223 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760226 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760228 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760232 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760236 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760240 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760244 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760247 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760249 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760252 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760255 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760258 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760260 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760264 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760281 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760284 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760287 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760289 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760292 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:52.764200 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760294 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760297 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760300 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760303 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760306 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760308 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760311 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760314 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760316 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760319 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760321 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760324 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760326 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:52.760329 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.760334 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:52.764828 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.761164 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:46:52.765196 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.763142 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:46:52.765196 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.763949 2577 server.go:1019] "Starting client certificate rotation"
Apr 22 18:46:52.765196 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.764047 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:46:52.765196 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.764676 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:46:52.787834 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.787808 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:46:52.790235 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.790214 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:46:52.800686 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.800665 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:46:52.808364 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.808345 2577 log.go:25] "Validated CRI v1 image API"
Apr 22 18:46:52.810313 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.810298 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:46:52.812987 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.812963 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8340c2b5-15d5-4b89-a280-8ac77a1a6184:/dev/nvme0n1p3 dbd21649-3304-415e-9527-877cc7e7bfd4:/dev/nvme0n1p4]
Apr 22 18:46:52.813050 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.812987 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:46:52.818261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.818147 2577 manager.go:217] Machine: {Timestamp:2026-04-22 18:46:52.817128869 +0000 UTC m=+0.341792412 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100284 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec292ea72318b240d66a1c4ac03ba09f SystemUUID:ec292ea7-2318-b240-d66a-1c4ac03ba09f BootID:3cb6e454-2b9f-4d16-9a3b-2d2c0baea6af Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:03:0d:dc:ff:67 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:03:0d:dc:ff:67 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:a6:d1:56:f4:cc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:46:52.818261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.818251 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:46:52.818411 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.818398 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:46:52.819916 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.819888 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:46:52.819982 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.819967 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:46:52.820117 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.819919 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-56.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage
":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:46:52.820163 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.820130 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:46:52.820163 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.820139 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:46:52.820163 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.820153 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:52.820755 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.820744 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:52.822198 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.822186 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:52.822349 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.822339 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:46:52.824193 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.824183 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:46:52.824230 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.824202 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:46:52.824230 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.824214 2577 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 22 18:46:52.824230 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.824224 2577 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:46:52.824403 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.824232 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:46:52.826754 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.826731 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:52.826859 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.826763 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:52.829987 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.829968 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:46:52.831485 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.831471 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:46:52.833105 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833089 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833110 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833119 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833124 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833131 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:46:52.833233 
ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833136 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833142 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833148 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833155 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833161 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833169 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833178 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833204 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:46:52.833233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.833210 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:46:52.834827 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.834809 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-56.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:46:52.834959 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:52.834887 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-56.ec2.internal\" is forbidden: User \"system:anonymous\" cannot 
list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:46:52.835004 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:52.834976 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:46:52.836890 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.836867 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fbgkm" Apr 22 18:46:52.837016 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.837005 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:46:52.837048 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.837040 2577 server.go:1295] "Started kubelet" Apr 22 18:46:52.837144 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.837120 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:46:52.837195 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.837133 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:46:52.837243 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.837206 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:46:52.837929 ip-10-0-143-56 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:46:52.838319 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.838302 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:46:52.839585 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.839569 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:46:52.843991 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.843970 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:52.844506 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.844493 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:46:52.844700 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.844681 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fbgkm" Apr 22 18:46:52.844890 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:52.844099 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-56.ec2.internal.18a8c23b465cf321 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-56.ec2.internal,UID:ip-10-0-143-56.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-56.ec2.internal,},FirstTimestamp:2026-04-22 18:46:52.837016353 +0000 UTC m=+0.361679894,LastTimestamp:2026-04-22 18:46:52.837016353 +0000 UTC m=+0.361679894,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-56.ec2.internal,}" Apr 22 18:46:52.845190 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.845166 2577 desired_state_of_world_populator.go:150] 
"Desired state populator starts to run" Apr 22 18:46:52.845190 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.845167 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:46:52.845335 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.845198 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:46:52.845335 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:52.845322 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:52.845945 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.845898 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:46:52.845945 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.845911 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:46:52.845945 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.845916 2577 factory.go:55] Registering systemd factory Apr 22 18:46:52.845945 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.845938 2577 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:46:52.846282 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.846256 2577 factory.go:153] Registering CRI-O factory Apr 22 18:46:52.846357 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.846285 2577 factory.go:223] Registration of the crio container factory successfully Apr 22 18:46:52.846357 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.846341 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:46:52.846462 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.846368 2577 factory.go:103] Registering Raw factory Apr 22 18:46:52.846462 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.846383 2577 manager.go:1196] Started watching for new ooms in manager 
Apr 22 18:46:52.846774 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.846761 2577 manager.go:319] Starting recovery of all containers Apr 22 18:46:52.848100 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:52.848066 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:46:52.852008 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.851988 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:52.855952 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:52.855927 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-56.ec2.internal\" not found" node="ip-10-0-143-56.ec2.internal" Apr 22 18:46:52.857812 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.857796 2577 manager.go:324] Recovery completed Apr 22 18:46:52.861827 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.861815 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:52.865790 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.865775 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:52.865871 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.865809 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:52.865871 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.865825 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:52.866345 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.866329 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:46:52.866345 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:46:52.866342 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:46:52.866460 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.866360 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:52.869732 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.869717 2577 policy_none.go:49] "None policy: Start" Apr 22 18:46:52.869732 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.869733 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:46:52.869797 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.869742 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:46:52.912911 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.912884 2577 manager.go:341] "Starting Device Plugin manager" Apr 22 18:46:52.913020 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:52.912965 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:46:52.913020 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.912980 2577 server.go:85] "Starting device plugin registration server" Apr 22 18:46:52.913250 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.913236 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:46:52.913328 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.913251 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:46:52.913537 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.913395 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:46:52.913537 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.913472 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:46:52.913537 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.913480 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 
18:46:52.914532 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:52.914325 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:46:52.914532 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:52.914370 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:52.984454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.984410 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:46:52.985668 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.985643 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:46:52.985769 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.985681 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:46:52.985769 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.985706 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:46:52.985769 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.985715 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:46:52.985906 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:52.985808 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:46:52.988398 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:52.988376 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:53.013642 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.013583 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:53.014447 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.014431 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:53.014537 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.014467 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:53.014537 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.014483 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:53.014537 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.014512 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.020670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.020655 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.020727 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.020676 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-56.ec2.internal\": node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:53.047013 
ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.046988 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:53.086096 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.086063 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-56.ec2.internal"] Apr 22 18:46:53.086152 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.086143 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:53.087052 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.087037 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:53.087138 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.087063 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:53.087138 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.087073 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:53.089437 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.089425 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:53.089574 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.089559 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.089620 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.089587 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:53.090102 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.090090 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:53.090102 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.090097 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:53.090227 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.090112 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:53.090227 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.090123 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:53.090227 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.090126 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:53.090227 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.090139 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:53.092473 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.092455 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.092518 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.092489 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:53.093223 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.093207 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:53.093316 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.093237 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:53.093316 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.093251 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:53.116960 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.116933 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-56.ec2.internal\" not found" node="ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.121451 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.121435 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-56.ec2.internal\" not found" node="ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.147767 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.147743 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:53.147870 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.147836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9853c0f8048e7a952da729afa0e9e9f0-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal\" (UID: \"9853c0f8048e7a952da729afa0e9e9f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.147910 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.147870 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9853c0f8048e7a952da729afa0e9e9f0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal\" (UID: \"9853c0f8048e7a952da729afa0e9e9f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.147910 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.147890 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/00b417b51c5d26c97b6e66b7df9d6ed9-config\") pod \"kube-apiserver-proxy-ip-10-0-143-56.ec2.internal\" (UID: \"00b417b51c5d26c97b6e66b7df9d6ed9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.248750 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.248713 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:53.248863 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.248802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9853c0f8048e7a952da729afa0e9e9f0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal\" (UID: \"9853c0f8048e7a952da729afa0e9e9f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.248863 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.248833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9853c0f8048e7a952da729afa0e9e9f0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal\" (UID: \"9853c0f8048e7a952da729afa0e9e9f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.248863 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.248854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/00b417b51c5d26c97b6e66b7df9d6ed9-config\") pod \"kube-apiserver-proxy-ip-10-0-143-56.ec2.internal\" (UID: \"00b417b51c5d26c97b6e66b7df9d6ed9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.248957 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.248884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/00b417b51c5d26c97b6e66b7df9d6ed9-config\") pod \"kube-apiserver-proxy-ip-10-0-143-56.ec2.internal\" (UID: \"00b417b51c5d26c97b6e66b7df9d6ed9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.248957 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.248897 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9853c0f8048e7a952da729afa0e9e9f0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal\" (UID: \"9853c0f8048e7a952da729afa0e9e9f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.248957 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.248897 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9853c0f8048e7a952da729afa0e9e9f0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal\" (UID: \"9853c0f8048e7a952da729afa0e9e9f0\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.349454 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.349376 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:53.418598 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.418568 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.424356 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.424335 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-56.ec2.internal" Apr 22 18:46:53.450081 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.450054 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:53.550477 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.550438 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:53.650896 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.650815 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:53.751283 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.751243 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:53.764443 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.764414 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:46:53.764640 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.764620 2577 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:53.764698 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.764626 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:53.844419 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.844389 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:53.849104 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.849080 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:41:52 +0000 UTC" deadline="2028-02-01 02:23:52.840758062 +0000 UTC" Apr 22 18:46:53.849104 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.849102 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15583h36m58.991658343s" Apr 22 18:46:53.851797 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.851778 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:53.854080 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.854063 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:53.879211 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.879196 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xntfr" Apr 
22 18:46:53.884800 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:53.884781 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xntfr" Apr 22 18:46:53.952456 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:53.952393 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:54.052887 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:54.052856 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:54.076667 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:54.076441 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b417b51c5d26c97b6e66b7df9d6ed9.slice/crio-329eed3923bd5833c1ecaed5e5c91c82a1452de129a2175259da0d7e633d059d WatchSource:0}: Error finding container 329eed3923bd5833c1ecaed5e5c91c82a1452de129a2175259da0d7e633d059d: Status 404 returned error can't find the container with id 329eed3923bd5833c1ecaed5e5c91c82a1452de129a2175259da0d7e633d059d Apr 22 18:46:54.076961 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:54.076942 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9853c0f8048e7a952da729afa0e9e9f0.slice/crio-83be59299049e7fd08fd87e288951b7bfb1df43c4de076ce48aaaf44f52d6590 WatchSource:0}: Error finding container 83be59299049e7fd08fd87e288951b7bfb1df43c4de076ce48aaaf44f52d6590: Status 404 returned error can't find the container with id 83be59299049e7fd08fd87e288951b7bfb1df43c4de076ce48aaaf44f52d6590 Apr 22 18:46:54.082621 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.082607 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:46:54.153024 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:54.152977 2577 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:54.203167 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.203111 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:54.253584 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:54.253539 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-56.ec2.internal\" not found" Apr 22 18:46:54.351396 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.351372 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:54.444957 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.444922 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" Apr 22 18:46:54.459313 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.459238 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:54.459938 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.459926 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-56.ec2.internal" Apr 22 18:46:54.468427 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.468411 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:54.825053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.824980 2577 apiserver.go:52] "Watching apiserver" Apr 22 18:46:54.832634 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.832600 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 
18:46:54.833135 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.833093 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-k85gs","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal","openshift-multus/multus-7frpv","openshift-multus/network-metrics-daemon-4q2cb","openshift-network-operator/iptables-alerter-n7nbq","openshift-ovn-kubernetes/ovnkube-node-7zbwn","kube-system/konnectivity-agent-zttd6","kube-system/kube-apiserver-proxy-ip-10-0-143-56.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4","openshift-cluster-node-tuning-operator/tuned-4h9jv","openshift-image-registry/node-ca-5t6hp","openshift-multus/multus-additional-cni-plugins-hm5p8","openshift-network-diagnostics/network-check-target-h4b8s"] Apr 22 18:46:54.838352 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.838307 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.840762 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.840746 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:46:54.840864 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:54.840814 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf" Apr 22 18:46:54.841061 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.841036 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:46:54.841281 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.841251 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:46:54.841865 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.841373 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qqhr8\"" Apr 22 18:46:54.841865 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.841257 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:46:54.843363 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.842853 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:46:54.845178 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.844952 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n7nbq" Apr 22 18:46:54.847113 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.847093 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zttd6" Apr 22 18:46:54.847315 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.847296 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.848726 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.847839 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jr8wm\"" Apr 22 18:46:54.848726 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.847933 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:54.848726 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.847941 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:54.848726 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.847941 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:46:54.849854 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.849184 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:46:54.849854 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.849523 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:46:54.849854 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.849719 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rvxv5\"" Apr 22 18:46:54.851986 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.851382 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:46:54.851986 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.851733 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-k85gs" Apr 22 18:46:54.853428 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.852248 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:46:54.853428 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.852506 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:46:54.853428 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.852599 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:46:54.853428 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.852726 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:46:54.854072 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.854056 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:54.856107 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.854588 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:46:54.856107 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.854614 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-ql9v8\"" Apr 22 18:46:54.856107 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.856050 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8kt4h\"" Apr 22 18:46:54.856573 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.856464 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:54.858142 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.857428 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:46:54.858142 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.857756 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:46:54.858142 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.857975 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-kubelet\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.858142 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858015 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d44f31c4-687d-42a9-b6c4-230bd1f85d32-konnectivity-ca\") pod \"konnectivity-agent-zttd6\" (UID: \"d44f31c4-687d-42a9-b6c4-230bd1f85d32\") " pod="kube-system/konnectivity-agent-zttd6" Apr 22 18:46:54.858142 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858049 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-system-cni-dir\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858142 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858079 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-run-multus-certs\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858142 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-node-log\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.858142 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-cni-dir\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-os-release\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858202 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-socket-dir-parent\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-var-lib-cni-bin\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858296 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-etc-kubernetes\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858317 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-422cs\" (UniqueName: \"kubernetes.io/projected/7b10df83-fc14-4a92-9ad2-800fbd71b62e-kube-api-access-422cs\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72313447-1a66-4fcb-905e-c46123a74148-host-slash\") pod \"iptables-alerter-n7nbq\" (UID: \"72313447-1a66-4fcb-905e-c46123a74148\") " pod="openshift-network-operator/iptables-alerter-n7nbq" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-run-ovn\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858409 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-run-k8s-cni-cncf-io\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858440 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-var-lib-kubelet\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858468 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbkmj\" (UniqueName: \"kubernetes.io/projected/c33a8222-6663-4971-9e27-d05681becacf-kube-api-access-vbkmj\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858504 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-var-lib-openvswitch\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858540 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-cni-netd\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858571 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12756ff-2896-467a-b08f-4d4ca991e872-ovnkube-script-lib\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-cnibin\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-var-lib-cni-multus\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858643 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:46:54.858670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858667 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-run-systemd\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: 
I0422 18:46:54.858699 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-etc-openvswitch\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12756ff-2896-467a-b08f-4d4ca991e872-env-overrides\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858747 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e12756ff-2896-467a-b08f-4d4ca991e872-ovn-node-metrics-cert\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b10df83-fc14-4a92-9ad2-800fbd71b62e-cni-binary-copy\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-run-netns\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " 
pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-hostroot\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-slash\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858867 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858873 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-run-netns\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-run-openvswitch\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858916 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-conf-dir\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-daemon-config\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltfrv\" (UniqueName: \"kubernetes.io/projected/72313447-1a66-4fcb-905e-c46123a74148-kube-api-access-ltfrv\") pod \"iptables-alerter-n7nbq\" (UID: \"72313447-1a66-4fcb-905e-c46123a74148\") " pod="openshift-network-operator/iptables-alerter-n7nbq" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-cni-bin\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7zbwn\" (UID: 
\"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859030 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12756ff-2896-467a-b08f-4d4ca991e872-ovnkube-config\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.859454 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859049 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgq4m\" (UniqueName: \"kubernetes.io/projected/e12756ff-2896-467a-b08f-4d4ca991e872-kube-api-access-kgq4m\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.860218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859068 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d44f31c4-687d-42a9-b6c4-230bd1f85d32-agent-certs\") pod \"konnectivity-agent-zttd6\" (UID: \"d44f31c4-687d-42a9-b6c4-230bd1f85d32\") " pod="kube-system/konnectivity-agent-zttd6" Apr 22 18:46:54.860218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859085 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/72313447-1a66-4fcb-905e-c46123a74148-iptables-alerter-script\") pod \"iptables-alerter-n7nbq\" (UID: \"72313447-1a66-4fcb-905e-c46123a74148\") " pod="openshift-network-operator/iptables-alerter-n7nbq" Apr 22 18:46:54.860218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-systemd-units\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.860218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859133 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-log-socket\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.860218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-run-ovn-kubernetes\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.860218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859175 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:46:54.860218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.858572 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:46:54.860218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859663 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xtggb\""
Apr 22 18:46:54.860218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859885 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:46:54.860218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.859907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v9xh8\""
Apr 22 18:46:54.860218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.860085 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:46:54.861963 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.861780 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:46:54.861963 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.861888 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5t6hp"
Apr 22 18:46:54.861963 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.861902 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hm5p8"
Apr 22 18:46:54.862384 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:54.861951 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c"
Apr 22 18:46:54.865717 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.865698 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:46:54.865792 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.865732 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:46:54.865844 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.865828 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:46:54.866006 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.865886 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:46:54.866006 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.865994 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-cgt5g\""
Apr 22 18:46:54.866152 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.866057 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:46:54.867096 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.867048 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kxtjg\""
Apr 22 18:46:54.887368 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.887338 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:53 +0000 UTC" deadline="2027-10-03 10:24:51.705766314 +0000 UTC"
Apr 22 18:46:54.887368 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.887369 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12687h37m56.818401325s"
Apr 22 18:46:54.946452 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.946426 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 18:46:54.959695 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.959659 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/72313447-1a66-4fcb-905e-c46123a74148-iptables-alerter-script\") pod \"iptables-alerter-n7nbq\" (UID: \"72313447-1a66-4fcb-905e-c46123a74148\") " pod="openshift-network-operator/iptables-alerter-n7nbq"
Apr 22 18:46:54.959840 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.959705 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-systemd-units\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.959840 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.959774 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-systemd-units\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.959840 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.959806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-log-socket\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.959840 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.959837 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-device-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4"
Apr 22 18:46:54.960053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.959876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-sysconfig\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv"
Apr 22 18:46:54.960053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.959899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-tuned\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv"
Apr 22 18:46:54.960053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.959921 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-log-socket\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.960053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.959925 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:46:54.960053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.959950 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-kubelet\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.960053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.959981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d44f31c4-687d-42a9-b6c4-230bd1f85d32-konnectivity-ca\") pod \"konnectivity-agent-zttd6\" (UID: \"d44f31c4-687d-42a9-b6c4-230bd1f85d32\") " pod="kube-system/konnectivity-agent-zttd6"
Apr 22 18:46:54.960053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-node-log\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.960053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-etc-selinux\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4"
Apr 22 18:46:54.960053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/77f410ec-92d3-4e11-871d-3bd6da0e0d1f-tmp-dir\") pod \"node-resolver-k85gs\" (UID: \"77f410ec-92d3-4e11-871d-3bd6da0e0d1f\") " pod="openshift-dns/node-resolver-k85gs"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-var-lib-cni-bin\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:54.960087 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-kubelet\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-etc-kubernetes\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-etc-kubernetes\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:54.960170 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs podName:c33a8222-6663-4971-9e27-d05681becacf nodeName:}" failed. No retries permitted until 2026-04-22 18:46:55.460135597 +0000 UTC m=+2.984799137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs") pod "network-metrics-daemon-4q2cb" (UID: "c33a8222-6663-4971-9e27-d05681becacf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960181 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-node-log\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960188 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-run-ovn\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12756ff-2896-467a-b08f-4d4ca991e872-ovnkube-script-lib\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-socket-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960305 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnh2v\" (UniqueName: \"kubernetes.io/projected/30ee9556-6a92-4520-a931-8d8ab472a6b5-kube-api-access-nnh2v\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv48p\" (UniqueName: \"kubernetes.io/projected/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-kube-api-access-kv48p\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-run-k8s-cni-cncf-io\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-var-lib-kubelet\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-var-lib-openvswitch\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-cni-netd\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.960496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-run-k8s-cni-cncf-io\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-var-lib-kubelet\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960561 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-var-lib-openvswitch\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-cni-netd\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-etc-openvswitch\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960622 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12756ff-2896-467a-b08f-4d4ca991e872-env-overrides\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e12756ff-2896-467a-b08f-4d4ca991e872-ovn-node-metrics-cert\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960655 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84c6q\" (UniqueName: \"kubernetes.io/projected/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-kube-api-access-84c6q\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-modprobe-d\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-var-lib-cni-bin\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.960908 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-run-ovn\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961060 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-sysctl-conf\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961074 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/72313447-1a66-4fcb-905e-c46123a74148-iptables-alerter-script\") pod \"iptables-alerter-n7nbq\" (UID: \"72313447-1a66-4fcb-905e-c46123a74148\") " pod="openshift-network-operator/iptables-alerter-n7nbq"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-etc-openvswitch\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961077 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-system-cni-dir\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b10df83-fc14-4a92-9ad2-800fbd71b62e-cni-binary-copy\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-run-netns\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.961329 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-hostroot\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-slash\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-run-netns\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-cni-bin\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-conf-dir\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12756ff-2896-467a-b08f-4d4ca991e872-ovnkube-config\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961404 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961428 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-kubernetes\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-lib-modules\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961472 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-var-lib-kubelet\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/023c27e3-86e9-4182-bf8f-c7b6197bc958-host\") pod \"node-ca-5t6hp\" (UID: \"023c27e3-86e9-4182-bf8f-c7b6197bc958\") " pod="openshift-image-registry/node-ca-5t6hp"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppmzs\" (UniqueName: \"kubernetes.io/projected/77f410ec-92d3-4e11-871d-3bd6da0e0d1f-kube-api-access-ppmzs\") pod \"node-resolver-k85gs\" (UID: \"77f410ec-92d3-4e11-871d-3bd6da0e0d1f\") " pod="openshift-dns/node-resolver-k85gs"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d44f31c4-687d-42a9-b6c4-230bd1f85d32-konnectivity-ca\") pod \"konnectivity-agent-zttd6\" (UID: \"d44f31c4-687d-42a9-b6c4-230bd1f85d32\") " pod="kube-system/konnectivity-agent-zttd6"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d44f31c4-687d-42a9-b6c4-230bd1f85d32-agent-certs\") pod \"konnectivity-agent-zttd6\" (UID: \"d44f31c4-687d-42a9-b6c4-230bd1f85d32\") " pod="kube-system/konnectivity-agent-zttd6"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961663 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-run-ovn-kubernetes\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961694 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-sys\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv"
Apr 22 18:46:54.962099 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961704 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12756ff-2896-467a-b08f-4d4ca991e872-ovnkube-script-lib\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961786 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-slash\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.961893 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12756ff-2896-467a-b08f-4d4ca991e872-ovnkube-config\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-run-ovn-kubernetes\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-run-netns\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-cni-bin\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b10df83-fc14-4a92-9ad2-800fbd71b62e-cni-binary-copy\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962298 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-conf-dir\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962306 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-run-netns\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962345 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-hostroot\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962377 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/77f410ec-92d3-4e11-871d-3bd6da0e0d1f-hosts-file\") pod \"node-resolver-k85gs\" (UID: \"77f410ec-92d3-4e11-871d-3bd6da0e0d1f\") " pod="openshift-dns/node-resolver-k85gs"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-system-cni-dir\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962432 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-run-multus-certs\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962459 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-sysctl-d\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-system-cni-dir\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962496 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-run-multus-certs\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.962989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-os-release\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8"
Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.962583 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-cni-dir\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-os-release\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963076 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-socket-dir-parent\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-cni-dir\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv"
Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963102 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-422cs\" (UniqueName:
\"kubernetes.io/projected/7b10df83-fc14-4a92-9ad2-800fbd71b62e-kube-api-access-422cs\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72313447-1a66-4fcb-905e-c46123a74148-host-slash\") pod \"iptables-alerter-n7nbq\" (UID: \"72313447-1a66-4fcb-905e-c46123a74148\") " pod="openshift-network-operator/iptables-alerter-n7nbq" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963152 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-sys-fs\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963165 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-os-release\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963178 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbkmj\" (UniqueName: \"kubernetes.io/projected/c33a8222-6663-4971-9e27-d05681becacf-kube-api-access-vbkmj\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-cnibin\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963231 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-var-lib-cni-multus\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-socket-dir-parent\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-run-systemd\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-host-var-lib-cni-multus\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963313 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/023c27e3-86e9-4182-bf8f-c7b6197bc958-serviceca\") pod \"node-ca-5t6hp\" (UID: \"023c27e3-86e9-4182-bf8f-c7b6197bc958\") " pod="openshift-image-registry/node-ca-5t6hp" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963351 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72313447-1a66-4fcb-905e-c46123a74148-host-slash\") pod \"iptables-alerter-n7nbq\" (UID: \"72313447-1a66-4fcb-905e-c46123a74148\") " pod="openshift-network-operator/iptables-alerter-n7nbq" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963363 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-run-systemd\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.963798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963378 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-run-openvswitch\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-registration-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963457 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-run\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963485 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-daemon-config\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltfrv\" (UniqueName: \"kubernetes.io/projected/72313447-1a66-4fcb-905e-c46123a74148-kube-api-access-ltfrv\") pod \"iptables-alerter-n7nbq\" (UID: \"72313447-1a66-4fcb-905e-c46123a74148\") " pod="openshift-network-operator/iptables-alerter-n7nbq" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgq4m\" (UniqueName: \"kubernetes.io/projected/e12756ff-2896-467a-b08f-4d4ca991e872-kube-api-access-kgq4m\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 
18:46:54.963563 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-systemd\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963587 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-host\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/30ee9556-6a92-4520-a931-8d8ab472a6b5-tmp\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c69rm\" (UniqueName: \"kubernetes.io/projected/023c27e3-86e9-4182-bf8f-c7b6197bc958-kube-api-access-c69rm\") pod \"node-ca-5t6hp\" (UID: \"023c27e3-86e9-4182-bf8f-c7b6197bc958\") " pod="openshift-image-registry/node-ca-5t6hp" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-cnibin\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " 
pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-cni-binary-copy\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963730 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963756 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gln4q\" (UniqueName: \"kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q\") pod \"network-check-target-h4b8s\" (UID: \"14401ce0-56c3-41fc-9d81-5b7fae368b4c\") " pod="openshift-network-diagnostics/network-check-target-h4b8s" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b10df83-fc14-4a92-9ad2-800fbd71b62e-cnibin\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.964627 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-run-openvswitch\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.965349 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.963921 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12756ff-2896-467a-b08f-4d4ca991e872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.965349 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.964655 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12756ff-2896-467a-b08f-4d4ca991e872-env-overrides\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.965349 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.965117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7b10df83-fc14-4a92-9ad2-800fbd71b62e-multus-daemon-config\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.966331 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.966308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d44f31c4-687d-42a9-b6c4-230bd1f85d32-agent-certs\") pod \"konnectivity-agent-zttd6\" (UID: \"d44f31c4-687d-42a9-b6c4-230bd1f85d32\") " pod="kube-system/konnectivity-agent-zttd6" Apr 22 18:46:54.967906 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.967885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/e12756ff-2896-467a-b08f-4d4ca991e872-ovn-node-metrics-cert\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.976672 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.976642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-422cs\" (UniqueName: \"kubernetes.io/projected/7b10df83-fc14-4a92-9ad2-800fbd71b62e-kube-api-access-422cs\") pod \"multus-7frpv\" (UID: \"7b10df83-fc14-4a92-9ad2-800fbd71b62e\") " pod="openshift-multus/multus-7frpv" Apr 22 18:46:54.984465 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.984438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltfrv\" (UniqueName: \"kubernetes.io/projected/72313447-1a66-4fcb-905e-c46123a74148-kube-api-access-ltfrv\") pod \"iptables-alerter-n7nbq\" (UID: \"72313447-1a66-4fcb-905e-c46123a74148\") " pod="openshift-network-operator/iptables-alerter-n7nbq" Apr 22 18:46:54.984921 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.984893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgq4m\" (UniqueName: \"kubernetes.io/projected/e12756ff-2896-467a-b08f-4d4ca991e872-kube-api-access-kgq4m\") pod \"ovnkube-node-7zbwn\" (UID: \"e12756ff-2896-467a-b08f-4d4ca991e872\") " pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:54.985869 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.985850 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbkmj\" (UniqueName: \"kubernetes.io/projected/c33a8222-6663-4971-9e27-d05681becacf-kube-api-access-vbkmj\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:46:54.992700 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.992652 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-143-56.ec2.internal" event={"ID":"00b417b51c5d26c97b6e66b7df9d6ed9","Type":"ContainerStarted","Data":"329eed3923bd5833c1ecaed5e5c91c82a1452de129a2175259da0d7e633d059d"} Apr 22 18:46:54.993845 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:54.993821 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" event={"ID":"9853c0f8048e7a952da729afa0e9e9f0","Type":"ContainerStarted","Data":"83be59299049e7fd08fd87e288951b7bfb1df43c4de076ce48aaaf44f52d6590"} Apr 22 18:46:55.064897 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.064830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/30ee9556-6a92-4520-a931-8d8ab472a6b5-tmp\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.064897 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.064867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c69rm\" (UniqueName: \"kubernetes.io/projected/023c27e3-86e9-4182-bf8f-c7b6197bc958-kube-api-access-c69rm\") pod \"node-ca-5t6hp\" (UID: \"023c27e3-86e9-4182-bf8f-c7b6197bc958\") " pod="openshift-image-registry/node-ca-5t6hp" Apr 22 18:46:55.064897 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.064892 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-cnibin\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.065199 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.064938 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-cnibin\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.065199 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.064972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-cni-binary-copy\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.065199 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.064995 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.065199 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gln4q\" (UniqueName: \"kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q\") pod \"network-check-target-h4b8s\" (UID: \"14401ce0-56c3-41fc-9d81-5b7fae368b4c\") " pod="openshift-network-diagnostics/network-check-target-h4b8s" Apr 22 18:46:55.065199 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-device-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.065199 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:46:55.065070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-sysconfig\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.065199 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-tuned\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.065199 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-etc-selinux\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.065199 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/77f410ec-92d3-4e11-871d-3bd6da0e0d1f-tmp-dir\") pod \"node-resolver-k85gs\" (UID: \"77f410ec-92d3-4e11-871d-3bd6da0e0d1f\") " pod="openshift-dns/node-resolver-k85gs" Apr 22 18:46:55.065199 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065178 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-socket-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.065199 
ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065201 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnh2v\" (UniqueName: \"kubernetes.io/projected/30ee9556-6a92-4520-a931-8d8ab472a6b5-kube-api-access-nnh2v\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kv48p\" (UniqueName: \"kubernetes.io/projected/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-kube-api-access-kv48p\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84c6q\" (UniqueName: \"kubernetes.io/projected/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-kube-api-access-84c6q\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-modprobe-d\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065324 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-sysctl-conf\") pod \"tuned-4h9jv\" (UID: 
\"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065348 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-system-cni-dir\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065382 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-kubernetes\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-lib-modules\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-var-lib-kubelet\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065468 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065529 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/023c27e3-86e9-4182-bf8f-c7b6197bc958-host\") pod \"node-ca-5t6hp\" (UID: \"023c27e3-86e9-4182-bf8f-c7b6197bc958\") " pod="openshift-image-registry/node-ca-5t6hp" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065537 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-device-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/023c27e3-86e9-4182-bf8f-c7b6197bc958-host\") pod \"node-ca-5t6hp\" (UID: \"023c27e3-86e9-4182-bf8f-c7b6197bc958\") " pod="openshift-image-registry/node-ca-5t6hp" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065614 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-sysconfig\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppmzs\" (UniqueName: \"kubernetes.io/projected/77f410ec-92d3-4e11-871d-3bd6da0e0d1f-kube-api-access-ppmzs\") pod \"node-resolver-k85gs\" (UID: \"77f410ec-92d3-4e11-871d-3bd6da0e0d1f\") " pod="openshift-dns/node-resolver-k85gs" Apr 22 18:46:55.065780 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-sys\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065688 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065714 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/77f410ec-92d3-4e11-871d-3bd6da0e0d1f-hosts-file\") pod \"node-resolver-k85gs\" (UID: \"77f410ec-92d3-4e11-871d-3bd6da0e0d1f\") " pod="openshift-dns/node-resolver-k85gs" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065740 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-sysctl-d\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-os-release\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065773 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-cni-binary-copy\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065784 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-sys-fs\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 
18:46:55.065802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/023c27e3-86e9-4182-bf8f-c7b6197bc958-serviceca\") pod \"node-ca-5t6hp\" (UID: \"023c27e3-86e9-4182-bf8f-c7b6197bc958\") " pod="openshift-image-registry/node-ca-5t6hp" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.065935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-sysctl-conf\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066043 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-modprobe-d\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066060 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-lib-modules\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-system-cni-dir\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066127 
2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/77f410ec-92d3-4e11-871d-3bd6da0e0d1f-tmp-dir\") pod \"node-resolver-k85gs\" (UID: \"77f410ec-92d3-4e11-871d-3bd6da0e0d1f\") " pod="openshift-dns/node-resolver-k85gs" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066143 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-socket-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/023c27e3-86e9-4182-bf8f-c7b6197bc958-serviceca\") pod \"node-ca-5t6hp\" (UID: \"023c27e3-86e9-4182-bf8f-c7b6197bc958\") " pod="openshift-image-registry/node-ca-5t6hp" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-registration-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.066543 ip-10-0-143-56 kubenswrapper[2577]: I0422 
18:46:55.066223 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-run\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/77f410ec-92d3-4e11-871d-3bd6da0e0d1f-hosts-file\") pod \"node-resolver-k85gs\" (UID: \"77f410ec-92d3-4e11-871d-3bd6da0e0d1f\") " pod="openshift-dns/node-resolver-k85gs" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-sys\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066390 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-registration-dir\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066197 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-kubernetes\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066446 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-run\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-var-lib-kubelet\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066523 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-systemd\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-host\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-os-release\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-systemd\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066653 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-host\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-sysctl-d\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-etc-selinux\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066774 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.066841 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-sys-fs\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.067307 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.067049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.068146 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.068119 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/30ee9556-6a92-4520-a931-8d8ab472a6b5-tmp\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.068520 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.068502 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/30ee9556-6a92-4520-a931-8d8ab472a6b5-etc-tuned\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.070988 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:55.070964 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:55.071092 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:55.070991 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:55.071092 
ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:55.071005 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gln4q for pod openshift-network-diagnostics/network-check-target-h4b8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:55.071092 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:55.071067 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q podName:14401ce0-56c3-41fc-9d81-5b7fae368b4c nodeName:}" failed. No retries permitted until 2026-04-22 18:46:55.571048878 +0000 UTC m=+3.095712408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gln4q" (UniqueName: "kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q") pod "network-check-target-h4b8s" (UID: "14401ce0-56c3-41fc-9d81-5b7fae368b4c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:55.073578 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.073545 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c69rm\" (UniqueName: \"kubernetes.io/projected/023c27e3-86e9-4182-bf8f-c7b6197bc958-kube-api-access-c69rm\") pod \"node-ca-5t6hp\" (UID: \"023c27e3-86e9-4182-bf8f-c7b6197bc958\") " pod="openshift-image-registry/node-ca-5t6hp" Apr 22 18:46:55.074758 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.074715 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnh2v\" (UniqueName: \"kubernetes.io/projected/30ee9556-6a92-4520-a931-8d8ab472a6b5-kube-api-access-nnh2v\") pod \"tuned-4h9jv\" (UID: \"30ee9556-6a92-4520-a931-8d8ab472a6b5\") " pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 
18:46:55.075382 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.075246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppmzs\" (UniqueName: \"kubernetes.io/projected/77f410ec-92d3-4e11-871d-3bd6da0e0d1f-kube-api-access-ppmzs\") pod \"node-resolver-k85gs\" (UID: \"77f410ec-92d3-4e11-871d-3bd6da0e0d1f\") " pod="openshift-dns/node-resolver-k85gs" Apr 22 18:46:55.075722 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.075700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv48p\" (UniqueName: \"kubernetes.io/projected/b6c4f00f-22a0-4f0d-bbe0-8b9038175a35-kube-api-access-kv48p\") pod \"multus-additional-cni-plugins-hm5p8\" (UID: \"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35\") " pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.076234 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.076214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84c6q\" (UniqueName: \"kubernetes.io/projected/28bacb0a-0ce3-4f72-9043-d4bdc3c704eb-kube-api-access-84c6q\") pod \"aws-ebs-csi-driver-node-6vsm4\" (UID: \"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.155149 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.155112 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7frpv" Apr 22 18:46:55.164962 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.164937 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n7nbq" Apr 22 18:46:55.169897 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.169876 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:55.174910 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.174886 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-zttd6" Apr 22 18:46:55.180624 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.180603 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:46:55.189198 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.189175 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k85gs" Apr 22 18:46:55.196846 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.196825 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" Apr 22 18:46:55.204384 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.204365 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" Apr 22 18:46:55.213902 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.213884 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5t6hp" Apr 22 18:46:55.218159 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.218136 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:55.221109 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.221090 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hm5p8" Apr 22 18:46:55.471591 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.471507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:46:55.471740 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:55.471644 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:55.471740 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:55.471701 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs podName:c33a8222-6663-4971-9e27-d05681becacf nodeName:}" failed. No retries permitted until 2026-04-22 18:46:56.471687628 +0000 UTC m=+3.996351157 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs") pod "network-metrics-daemon-4q2cb" (UID: "c33a8222-6663-4971-9e27-d05681becacf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:55.572386 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.572354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gln4q\" (UniqueName: \"kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q\") pod \"network-check-target-h4b8s\" (UID: \"14401ce0-56c3-41fc-9d81-5b7fae368b4c\") " pod="openshift-network-diagnostics/network-check-target-h4b8s" Apr 22 18:46:55.572541 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:55.572511 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:55.572541 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:55.572534 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:55.572607 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:55.572544 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gln4q for pod openshift-network-diagnostics/network-check-target-h4b8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:55.572607 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:55.572602 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q podName:14401ce0-56c3-41fc-9d81-5b7fae368b4c nodeName:}" failed. 
No retries permitted until 2026-04-22 18:46:56.572581563 +0000 UTC m=+4.097245093 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gln4q" (UniqueName: "kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q") pod "network-check-target-h4b8s" (UID: "14401ce0-56c3-41fc-9d81-5b7fae368b4c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:55.888319 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.888232 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:53 +0000 UTC" deadline="2027-10-29 22:49:16.88974257 +0000 UTC" Apr 22 18:46:55.888319 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.888261 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13324h2m21.00148423s" Apr 22 18:46:55.964685 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:55.964660 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod023c27e3_86e9_4182_bf8f_c7b6197bc958.slice/crio-f3a2afb86330171065d1d7b3b41bc66d73445934d27f3141eda6a0a066a53fda WatchSource:0}: Error finding container f3a2afb86330171065d1d7b3b41bc66d73445934d27f3141eda6a0a066a53fda: Status 404 returned error can't find the container with id f3a2afb86330171065d1d7b3b41bc66d73445934d27f3141eda6a0a066a53fda Apr 22 18:46:55.966046 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:55.965951 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6c4f00f_22a0_4f0d_bbe0_8b9038175a35.slice/crio-8462a6d17b3ce4f9f37e4e201aa936b86c28d9bcbd03f12e87181ad12e27c9b4 WatchSource:0}: Error finding container 
8462a6d17b3ce4f9f37e4e201aa936b86c28d9bcbd03f12e87181ad12e27c9b4: Status 404 returned error can't find the container with id 8462a6d17b3ce4f9f37e4e201aa936b86c28d9bcbd03f12e87181ad12e27c9b4 Apr 22 18:46:55.967504 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:55.967473 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77f410ec_92d3_4e11_871d_3bd6da0e0d1f.slice/crio-488504fffbb1f3043d701d7c142132bbdee30bdcaff2825f97df08a0310557a7 WatchSource:0}: Error finding container 488504fffbb1f3043d701d7c142132bbdee30bdcaff2825f97df08a0310557a7: Status 404 returned error can't find the container with id 488504fffbb1f3043d701d7c142132bbdee30bdcaff2825f97df08a0310557a7 Apr 22 18:46:55.971323 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:55.971297 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd44f31c4_687d_42a9_b6c4_230bd1f85d32.slice/crio-f2868be2b73b75e2b0e13e3b12112f8c4d4f3ab6d679a895b1901059a83734ec WatchSource:0}: Error finding container f2868be2b73b75e2b0e13e3b12112f8c4d4f3ab6d679a895b1901059a83734ec: Status 404 returned error can't find the container with id f2868be2b73b75e2b0e13e3b12112f8c4d4f3ab6d679a895b1901059a83734ec Apr 22 18:46:55.972553 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:55.972530 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12756ff_2896_467a_b08f_4d4ca991e872.slice/crio-fcf03d0e533b26b3add3bf5cf5ea668e019a83e6465b1553bdcdaac88c40c4a3 WatchSource:0}: Error finding container fcf03d0e533b26b3add3bf5cf5ea668e019a83e6465b1553bdcdaac88c40c4a3: Status 404 returned error can't find the container with id fcf03d0e533b26b3add3bf5cf5ea668e019a83e6465b1553bdcdaac88c40c4a3 Apr 22 18:46:55.973898 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:55.973798 2577 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28bacb0a_0ce3_4f72_9043_d4bdc3c704eb.slice/crio-398afc170711c693cafc3a985c9585c3a89d9d48ef4d6c8d7b322db05fa8073b WatchSource:0}: Error finding container 398afc170711c693cafc3a985c9585c3a89d9d48ef4d6c8d7b322db05fa8073b: Status 404 returned error can't find the container with id 398afc170711c693cafc3a985c9585c3a89d9d48ef4d6c8d7b322db05fa8073b Apr 22 18:46:55.974222 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:55.974197 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b10df83_fc14_4a92_9ad2_800fbd71b62e.slice/crio-ea0d2b106fbf3b59a6aa6161d749336166bd2e0cd9a447183559fc6b721f854a WatchSource:0}: Error finding container ea0d2b106fbf3b59a6aa6161d749336166bd2e0cd9a447183559fc6b721f854a: Status 404 returned error can't find the container with id ea0d2b106fbf3b59a6aa6161d749336166bd2e0cd9a447183559fc6b721f854a Apr 22 18:46:55.975917 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:46:55.975724 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30ee9556_6a92_4520_a931_8d8ab472a6b5.slice/crio-622d4ff0c2d8afd54b5ec2832bd4fc153241b480b0071b109c3ce3b3b8faa925 WatchSource:0}: Error finding container 622d4ff0c2d8afd54b5ec2832bd4fc153241b480b0071b109c3ce3b3b8faa925: Status 404 returned error can't find the container with id 622d4ff0c2d8afd54b5ec2832bd4fc153241b480b0071b109c3ce3b3b8faa925 Apr 22 18:46:55.998157 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.996004 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" event={"ID":"30ee9556-6a92-4520-a931-8d8ab472a6b5","Type":"ContainerStarted","Data":"622d4ff0c2d8afd54b5ec2832bd4fc153241b480b0071b109c3ce3b3b8faa925"} Apr 22 18:46:55.999312 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:55.999286 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-7frpv" event={"ID":"7b10df83-fc14-4a92-9ad2-800fbd71b62e","Type":"ContainerStarted","Data":"ea0d2b106fbf3b59a6aa6161d749336166bd2e0cd9a447183559fc6b721f854a"} Apr 22 18:46:56.000223 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:56.000191 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" event={"ID":"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb","Type":"ContainerStarted","Data":"398afc170711c693cafc3a985c9585c3a89d9d48ef4d6c8d7b322db05fa8073b"} Apr 22 18:46:56.001238 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:56.001217 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" event={"ID":"e12756ff-2896-467a-b08f-4d4ca991e872","Type":"ContainerStarted","Data":"fcf03d0e533b26b3add3bf5cf5ea668e019a83e6465b1553bdcdaac88c40c4a3"} Apr 22 18:46:56.002129 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:56.002109 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zttd6" event={"ID":"d44f31c4-687d-42a9-b6c4-230bd1f85d32","Type":"ContainerStarted","Data":"f2868be2b73b75e2b0e13e3b12112f8c4d4f3ab6d679a895b1901059a83734ec"} Apr 22 18:46:56.003060 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:56.003038 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k85gs" event={"ID":"77f410ec-92d3-4e11-871d-3bd6da0e0d1f","Type":"ContainerStarted","Data":"488504fffbb1f3043d701d7c142132bbdee30bdcaff2825f97df08a0310557a7"} Apr 22 18:46:56.004045 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:56.004017 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n7nbq" event={"ID":"72313447-1a66-4fcb-905e-c46123a74148","Type":"ContainerStarted","Data":"8ff4cd65cf13f635f7eb1d2e705d1cb9531e6d755eef30ab2afeec563ce0b44f"} Apr 22 18:46:56.004947 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:56.004925 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hm5p8" event={"ID":"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35","Type":"ContainerStarted","Data":"8462a6d17b3ce4f9f37e4e201aa936b86c28d9bcbd03f12e87181ad12e27c9b4"}
Apr 22 18:46:56.005922 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:56.005904 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5t6hp" event={"ID":"023c27e3-86e9-4182-bf8f-c7b6197bc958","Type":"ContainerStarted","Data":"f3a2afb86330171065d1d7b3b41bc66d73445934d27f3141eda6a0a066a53fda"}
Apr 22 18:46:56.478910 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:56.478871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:46:56.479087 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:56.479008 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:56.479087 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:56.479072 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs podName:c33a8222-6663-4971-9e27-d05681becacf nodeName:}" failed. No retries permitted until 2026-04-22 18:46:58.479053631 +0000 UTC m=+6.003717185 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs") pod "network-metrics-daemon-4q2cb" (UID: "c33a8222-6663-4971-9e27-d05681becacf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:56.580217 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:56.580150 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gln4q\" (UniqueName: \"kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q\") pod \"network-check-target-h4b8s\" (UID: \"14401ce0-56c3-41fc-9d81-5b7fae368b4c\") " pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:46:56.580384 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:56.580368 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:56.580467 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:56.580389 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:56.580467 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:56.580401 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gln4q for pod openshift-network-diagnostics/network-check-target-h4b8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:56.580467 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:56.580465 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q podName:14401ce0-56c3-41fc-9d81-5b7fae368b4c nodeName:}" failed. No retries permitted until 2026-04-22 18:46:58.580447256 +0000 UTC m=+6.105110789 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gln4q" (UniqueName: "kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q") pod "network-check-target-h4b8s" (UID: "14401ce0-56c3-41fc-9d81-5b7fae368b4c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:56.997148 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:56.997117 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:46:56.997662 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:56.997311 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c"
Apr 22 18:46:56.997826 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:56.997760 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:46:56.997882 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:56.997861 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf"
Apr 22 18:46:57.033964 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:57.033872 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-56.ec2.internal" event={"ID":"00b417b51c5d26c97b6e66b7df9d6ed9","Type":"ContainerStarted","Data":"1263fc522f42972c945000e1bd3329347be1336a891c94d291ae653c5f46da22"}
Apr 22 18:46:57.043287 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:57.043240 2577 generic.go:358] "Generic (PLEG): container finished" podID="9853c0f8048e7a952da729afa0e9e9f0" containerID="29e0a081fa44b07a8fb54842667afc3686ce7bd6ff700c56fcab8d47bf0fc46a" exitCode=0
Apr 22 18:46:57.043457 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:57.043304 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" event={"ID":"9853c0f8048e7a952da729afa0e9e9f0","Type":"ContainerDied","Data":"29e0a081fa44b07a8fb54842667afc3686ce7bd6ff700c56fcab8d47bf0fc46a"}
Apr 22 18:46:57.051082 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:57.050492 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-56.ec2.internal" podStartSLOduration=3.050474187 podStartE2EDuration="3.050474187s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:57.049930904 +0000 UTC m=+4.574594455" watchObservedRunningTime="2026-04-22 18:46:57.050474187 +0000 UTC m=+4.575137740"
Apr 22 18:46:58.069791 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:58.069749 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" event={"ID":"9853c0f8048e7a952da729afa0e9e9f0","Type":"ContainerStarted","Data":"fe895c810cfc126e96e57ca0beb5063824713745234fcbc37bb24c9b638fd9de"}
Apr 22 18:46:58.501806 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:58.501629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:46:58.501975 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:58.501820 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:58.501975 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:58.501898 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs podName:c33a8222-6663-4971-9e27-d05681becacf nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.501877251 +0000 UTC m=+10.026540796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs") pod "network-metrics-daemon-4q2cb" (UID: "c33a8222-6663-4971-9e27-d05681becacf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:58.603495 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:58.602805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gln4q\" (UniqueName: \"kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q\") pod \"network-check-target-h4b8s\" (UID: \"14401ce0-56c3-41fc-9d81-5b7fae368b4c\") " pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:46:58.603495 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:58.603015 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:58.603495 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:58.603034 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:58.603495 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:58.603047 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gln4q for pod openshift-network-diagnostics/network-check-target-h4b8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:58.603495 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:58.603107 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q podName:14401ce0-56c3-41fc-9d81-5b7fae368b4c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.603086781 +0000 UTC m=+10.127750319 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gln4q" (UniqueName: "kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q") pod "network-check-target-h4b8s" (UID: "14401ce0-56c3-41fc-9d81-5b7fae368b4c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:58.987305 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:58.986415 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:46:58.987305 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:58.986540 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c"
Apr 22 18:46:58.987305 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:46:58.987076 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:46:58.987305 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:46:58.987196 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf"
Apr 22 18:47:00.986962 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:00.986922 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:00.987488 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:00.987055 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c"
Apr 22 18:47:00.987488 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:00.987135 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:00.987488 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:00.987216 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf"
Apr 22 18:47:02.535826 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:02.535749 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:02.536215 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:02.535906 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:02.536215 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:02.535970 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs podName:c33a8222-6663-4971-9e27-d05681becacf nodeName:}" failed. No retries permitted until 2026-04-22 18:47:10.535954408 +0000 UTC m=+18.060617941 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs") pod "network-metrics-daemon-4q2cb" (UID: "c33a8222-6663-4971-9e27-d05681becacf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:02.636388 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:02.636300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gln4q\" (UniqueName: \"kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q\") pod \"network-check-target-h4b8s\" (UID: \"14401ce0-56c3-41fc-9d81-5b7fae368b4c\") " pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:02.636571 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:02.636471 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:47:02.636571 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:02.636491 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:47:02.636571 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:02.636506 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gln4q for pod openshift-network-diagnostics/network-check-target-h4b8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:02.636571 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:02.636556 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q podName:14401ce0-56c3-41fc-9d81-5b7fae368b4c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:10.636538901 +0000 UTC m=+18.161202446 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gln4q" (UniqueName: "kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q") pod "network-check-target-h4b8s" (UID: "14401ce0-56c3-41fc-9d81-5b7fae368b4c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:02.989928 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:02.989791 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:02.990059 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:02.989921 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf"
Apr 22 18:47:02.990115 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:02.990096 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:02.990205 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:02.990186 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c"
Apr 22 18:47:04.989580 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:04.989547 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:04.989976 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:04.989549 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:04.989976 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:04.989664 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c"
Apr 22 18:47:04.989976 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:04.989769 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf"
Apr 22 18:47:06.986762 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:06.986506 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:06.987298 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:06.986580 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:06.987298 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:06.986874 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c"
Apr 22 18:47:06.987298 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:06.986957 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf"
Apr 22 18:47:08.986520 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:08.986421 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:08.986978 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:08.986544 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c"
Apr 22 18:47:08.986978 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:08.986611 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:08.986978 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:08.986739 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf"
Apr 22 18:47:10.588049 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:10.588014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:10.588535 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:10.588141 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:10.588535 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:10.588197 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs podName:c33a8222-6663-4971-9e27-d05681becacf nodeName:}" failed. No retries permitted until 2026-04-22 18:47:26.588177493 +0000 UTC m=+34.112841022 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs") pod "network-metrics-daemon-4q2cb" (UID: "c33a8222-6663-4971-9e27-d05681becacf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:10.689177 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:10.689133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gln4q\" (UniqueName: \"kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q\") pod \"network-check-target-h4b8s\" (UID: \"14401ce0-56c3-41fc-9d81-5b7fae368b4c\") " pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:10.689487 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:10.689306 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:47:10.689487 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:10.689327 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:47:10.689487 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:10.689341 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gln4q for pod openshift-network-diagnostics/network-check-target-h4b8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:10.689487 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:10.689399 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q podName:14401ce0-56c3-41fc-9d81-5b7fae368b4c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:26.689380722 +0000 UTC m=+34.214044263 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gln4q" (UniqueName: "kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q") pod "network-check-target-h4b8s" (UID: "14401ce0-56c3-41fc-9d81-5b7fae368b4c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:10.986101 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:10.986021 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:10.986259 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:10.986156 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf"
Apr 22 18:47:10.986259 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:10.986211 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:10.986373 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:10.986342 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c"
Apr 22 18:47:12.986761 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:12.986718 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:12.987211 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:12.986806 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c"
Apr 22 18:47:12.987211 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:12.986883 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:12.987211 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:12.986971 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf"
Apr 22 18:47:14.101351 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.101067 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5t6hp" event={"ID":"023c27e3-86e9-4182-bf8f-c7b6197bc958","Type":"ContainerStarted","Data":"22c1c8921186d4a2ed2e07bbe91a2c7b4c39163b96dd78fbd8171df1e5592aa6"}
Apr 22 18:47:14.102562 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.102531 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" event={"ID":"30ee9556-6a92-4520-a931-8d8ab472a6b5","Type":"ContainerStarted","Data":"7ee11307b1847da248b74531cd96447e6b63817004e0d2a53a94fa02591881c0"}
Apr 22 18:47:14.103987 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.103962 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7frpv" event={"ID":"7b10df83-fc14-4a92-9ad2-800fbd71b62e","Type":"ContainerStarted","Data":"6dd0c80fda27a03d0b760f999530fe9bfd2b5b8667cdd8b843d6249a3b2ed6e8"}
Apr 22 18:47:14.105514 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.105480 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" event={"ID":"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb","Type":"ContainerStarted","Data":"1b7ff776b66b5881f7e236ea6a88b2d24ee0648ab5ee1853ec126da8cb5ca1b1"}
Apr 22 18:47:14.108318 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.108295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" event={"ID":"e12756ff-2896-467a-b08f-4d4ca991e872","Type":"ContainerStarted","Data":"7e89cae35fba82e862a86a9d1d91d352ee5e23102cd4861deef0a0001b410ca2"}
Apr 22 18:47:14.108406 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.108327 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" event={"ID":"e12756ff-2896-467a-b08f-4d4ca991e872","Type":"ContainerStarted","Data":"d0bc2122471b138936dbab17e7177e37fcb8b4677f1df499fa9ff267bd691288"}
Apr 22 18:47:14.108406 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.108340 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" event={"ID":"e12756ff-2896-467a-b08f-4d4ca991e872","Type":"ContainerStarted","Data":"23a7e88e710a483cad13931956489fd9959fbea97134e6ddfc9eec2ad6aad6d2"}
Apr 22 18:47:14.108406 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.108352 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" event={"ID":"e12756ff-2896-467a-b08f-4d4ca991e872","Type":"ContainerStarted","Data":"b857210baf264801ab0abe33c55a742509ac110d892c7d57c888b6d739b9d882"}
Apr 22 18:47:14.108406 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.108367 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" event={"ID":"e12756ff-2896-467a-b08f-4d4ca991e872","Type":"ContainerStarted","Data":"1ed4a1134c039001e92fb7f18cedec9fed1f44b425c4b24472ea93a243e09a81"}
Apr 22 18:47:14.108406 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.108378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" event={"ID":"e12756ff-2896-467a-b08f-4d4ca991e872","Type":"ContainerStarted","Data":"eae37a430932537d78922694eefd0e85c218f06daede219b4aa0b88a478f1f91"}
Apr 22 18:47:14.109631 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.109603 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zttd6" event={"ID":"d44f31c4-687d-42a9-b6c4-230bd1f85d32","Type":"ContainerStarted","Data":"ddaf924f149865687da049d141534ffb5ab2c8d0c8ff51d11656d24a771e7827"}
Apr 22 18:47:14.110977 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.110947 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k85gs" event={"ID":"77f410ec-92d3-4e11-871d-3bd6da0e0d1f","Type":"ContainerStarted","Data":"6ed427d2c7bab87506360d96ab2b88f2520151590d8b4dcda0836c6de2021da8"}
Apr 22 18:47:14.112428 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.112401 2577 generic.go:358] "Generic (PLEG): container finished" podID="b6c4f00f-22a0-4f0d-bbe0-8b9038175a35" containerID="f37cbc7155a24aef5600327e66d710b34b2da29b38af7386615445ea492e6138" exitCode=0
Apr 22 18:47:14.112522 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.112441 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hm5p8" event={"ID":"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35","Type":"ContainerDied","Data":"f37cbc7155a24aef5600327e66d710b34b2da29b38af7386615445ea492e6138"}
Apr 22 18:47:14.115549 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.115501 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5t6hp" podStartSLOduration=3.9665298829999998 podStartE2EDuration="21.11548657s" podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:46:55.966738829 +0000 UTC m=+3.491402360" lastFinishedPulling="2026-04-22 18:47:13.115695507 +0000 UTC m=+20.640359047" observedRunningTime="2026-04-22 18:47:14.115423737 +0000 UTC m=+21.640087289" watchObservedRunningTime="2026-04-22 18:47:14.11548657 +0000 UTC m=+21.640150124"
Apr 22 18:47:14.115820 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.115787 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-56.ec2.internal" podStartSLOduration=20.115779976 podStartE2EDuration="20.115779976s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:58.083978839 +0000 UTC m=+5.608642445" watchObservedRunningTime="2026-04-22 18:47:14.115779976 +0000 UTC m=+21.640443526"
Apr 22 18:47:14.145637 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.145589 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7frpv" podStartSLOduration=3.9709095530000003 podStartE2EDuration="21.145575397s" podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:46:55.977126562 +0000 UTC m=+3.501790092" lastFinishedPulling="2026-04-22 18:47:13.151792407 +0000 UTC m=+20.676455936" observedRunningTime="2026-04-22 18:47:14.145346755 +0000 UTC m=+21.670010309" watchObservedRunningTime="2026-04-22 18:47:14.145575397 +0000 UTC m=+21.670238927"
Apr 22 18:47:14.145924 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.145896 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zttd6" podStartSLOduration=8.765871214 podStartE2EDuration="21.145889729s" podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:46:55.973669349 +0000 UTC m=+3.498332881" lastFinishedPulling="2026-04-22 18:47:08.353687863 +0000 UTC m=+15.878351396" observedRunningTime="2026-04-22 18:47:14.129053659 +0000 UTC m=+21.653717212" watchObservedRunningTime="2026-04-22 18:47:14.145889729 +0000 UTC m=+21.670553280"
Apr 22 18:47:14.159282 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.159172 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-k85gs" podStartSLOduration=4.013585859 podStartE2EDuration="21.159157411s" podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:46:55.970106164 +0000 UTC m=+3.494769707" lastFinishedPulling="2026-04-22 18:47:13.115677727 +0000 UTC m=+20.640341259" observedRunningTime="2026-04-22 18:47:14.158861157 +0000 UTC m=+21.683524705" watchObservedRunningTime="2026-04-22 18:47:14.159157411 +0000 UTC m=+21.683820963"
Apr 22 18:47:14.178366 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.178325 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4h9jv" podStartSLOduration=4.044165498 podStartE2EDuration="21.178307124s" podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:46:55.981401698 +0000 UTC m=+3.506065242" lastFinishedPulling="2026-04-22 18:47:13.115543319 +0000 UTC m=+20.640206868" observedRunningTime="2026-04-22 18:47:14.17806837 +0000 UTC m=+21.702731920" watchObservedRunningTime="2026-04-22 18:47:14.178307124 +0000 UTC m=+21.702970669"
Apr 22 18:47:14.299011 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.298864 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zttd6"
Apr 22 18:47:14.299485 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.299467 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zttd6"
Apr 22 18:47:14.320166 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.320144 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 18:47:14.927708 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.927549 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:47:14.320161991Z","UUID":"daaf62f8-14a9-4463-9664-57b80a1343fc","Handler":null,"Name":"","Endpoint":""}
Apr 22 18:47:14.931033 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.931006 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 18:47:14.931033 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.931039 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 18:47:14.987081 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.987048 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:14.987241 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:14.987173 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c"
Apr 22 18:47:14.987622 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:14.987601 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:14.987744 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:14.987706 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf" Apr 22 18:47:15.116377 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:15.116344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" event={"ID":"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb","Type":"ContainerStarted","Data":"c16ed409df2fead5750360457df8e5eff118c4f38d79d5452ee014a1c8210097"} Apr 22 18:47:15.117951 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:15.117918 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n7nbq" event={"ID":"72313447-1a66-4fcb-905e-c46123a74148","Type":"ContainerStarted","Data":"829de98a7256062030ac4e2d93f96b633c1f31b2eaf0c0e8c8a8ae02e082fdf3"} Apr 22 18:47:15.131969 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:15.131917 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-n7nbq" podStartSLOduration=4.996532574 podStartE2EDuration="22.13189917s" podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:46:55.980336775 +0000 UTC m=+3.505000318" lastFinishedPulling="2026-04-22 18:47:13.115703384 +0000 UTC m=+20.640366914" observedRunningTime="2026-04-22 18:47:15.131393091 +0000 UTC m=+22.656056644" watchObservedRunningTime="2026-04-22 18:47:15.13189917 +0000 UTC m=+22.656562723" Apr 22 18:47:16.121837 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:16.121739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" event={"ID":"28bacb0a-0ce3-4f72-9043-d4bdc3c704eb","Type":"ContainerStarted","Data":"951742694b5720b975e036e1910f68eda7c72eb3227b177c29d4fb4db9a6a177"} Apr 22 18:47:16.124499 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:16.124474 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" 
event={"ID":"e12756ff-2896-467a-b08f-4d4ca991e872","Type":"ContainerStarted","Data":"64fb6c5c9ce39b6f97d521767ceb8ddcb3ac926412e3f5650f94d9fc86cfcae0"} Apr 22 18:47:16.124623 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:16.124514 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:16.986244 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:16.986209 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:47:16.986244 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:16.986246 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s" Apr 22 18:47:16.986495 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:16.986362 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf" Apr 22 18:47:16.986495 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:16.986468 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c" Apr 22 18:47:18.132496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:18.132248 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" event={"ID":"e12756ff-2896-467a-b08f-4d4ca991e872","Type":"ContainerStarted","Data":"253b63179a11af8f01400508982337a12a32b8910ac0582455f5568257df0d28"} Apr 22 18:47:18.133090 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:18.132569 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:47:18.133090 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:18.132585 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:47:18.148123 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:18.148096 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:47:18.157853 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:18.157811 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vsm4" podStartSLOduration=5.911524158 podStartE2EDuration="25.157798344s" podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:46:55.975775529 +0000 UTC m=+3.500439074" lastFinishedPulling="2026-04-22 18:47:15.222049728 +0000 UTC m=+22.746713260" observedRunningTime="2026-04-22 18:47:16.139722554 +0000 UTC m=+23.664386107" watchObservedRunningTime="2026-04-22 18:47:18.157798344 +0000 UTC m=+25.682461895" Apr 22 18:47:18.158325 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:18.158296 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" podStartSLOduration=7.93468073 podStartE2EDuration="25.158288363s" 
podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:46:55.975256782 +0000 UTC m=+3.499920325" lastFinishedPulling="2026-04-22 18:47:13.198864426 +0000 UTC m=+20.723527958" observedRunningTime="2026-04-22 18:47:18.157606846 +0000 UTC m=+25.682270399" watchObservedRunningTime="2026-04-22 18:47:18.158288363 +0000 UTC m=+25.682951911" Apr 22 18:47:18.986353 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:18.986322 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s" Apr 22 18:47:18.986353 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:18.986350 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:47:18.986586 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:18.986439 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c" Apr 22 18:47:18.986586 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:18.986571 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf" Apr 22 18:47:19.135401 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:19.135334 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:47:19.152970 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:19.152943 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn" Apr 22 18:47:19.864023 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:19.863987 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4q2cb"] Apr 22 18:47:19.864195 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:19.864143 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:47:19.864321 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:19.864264 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf" Apr 22 18:47:19.866246 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:19.866218 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h4b8s"] Apr 22 18:47:19.866386 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:19.866360 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s" Apr 22 18:47:19.866473 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:19.866452 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c" Apr 22 18:47:20.986980 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:20.986757 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s" Apr 22 18:47:20.987351 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:20.987096 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c" Apr 22 18:47:21.139144 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:21.139062 2577 generic.go:358] "Generic (PLEG): container finished" podID="b6c4f00f-22a0-4f0d-bbe0-8b9038175a35" containerID="a82ad7f1fa13c5cc23dea3bd36a381ca2ce918483c9ad26a3d07cda0f3b2f594" exitCode=0 Apr 22 18:47:21.139318 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:21.139154 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hm5p8" event={"ID":"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35","Type":"ContainerDied","Data":"a82ad7f1fa13c5cc23dea3bd36a381ca2ce918483c9ad26a3d07cda0f3b2f594"} Apr 22 18:47:21.986334 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:21.986299 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:47:21.986510 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:21.986437 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf" Apr 22 18:47:22.987066 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:22.987036 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s" Apr 22 18:47:22.987576 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:22.987122 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c" Apr 22 18:47:23.143789 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:23.143756 2577 generic.go:358] "Generic (PLEG): container finished" podID="b6c4f00f-22a0-4f0d-bbe0-8b9038175a35" containerID="26ab9ab880e040871df4a69adbe004e3bb1e0e8ef212940de62918c7a7348003" exitCode=0 Apr 22 18:47:23.143918 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:23.143819 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hm5p8" event={"ID":"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35","Type":"ContainerDied","Data":"26ab9ab880e040871df4a69adbe004e3bb1e0e8ef212940de62918c7a7348003"} Apr 22 18:47:23.986135 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:23.986105 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:47:23.986294 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:23.986211 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf" Apr 22 18:47:24.617390 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:24.617308 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zttd6" Apr 22 18:47:24.617729 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:24.617438 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:24.617996 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:24.617972 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zttd6" Apr 22 18:47:24.986625 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:24.986597 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s" Apr 22 18:47:24.986791 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:24.986686 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h4b8s" podUID="14401ce0-56c3-41fc-9d81-5b7fae368b4c" Apr 22 18:47:25.149411 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:25.149376 2577 generic.go:358] "Generic (PLEG): container finished" podID="b6c4f00f-22a0-4f0d-bbe0-8b9038175a35" containerID="e9608a0931993f20249d2e2773e08edca00f7b67a3c2dfdd97b3a565c5103ba5" exitCode=0 Apr 22 18:47:25.149576 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:25.149421 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hm5p8" event={"ID":"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35","Type":"ContainerDied","Data":"e9608a0931993f20249d2e2773e08edca00f7b67a3c2dfdd97b3a565c5103ba5"} Apr 22 18:47:25.986670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:25.986641 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:47:25.987379 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:25.986782 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4q2cb" podUID="c33a8222-6663-4971-9e27-d05681becacf" Apr 22 18:47:26.292902 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.292875 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-56.ec2.internal" event="NodeReady" Apr 22 18:47:26.293076 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.293012 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:47:26.334602 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.334570 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qnb4b"] Apr 22 18:47:26.359132 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.359007 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-68s2k"] Apr 22 18:47:26.359132 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.359108 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qnb4b" Apr 22 18:47:26.362038 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.362018 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdfh7\"" Apr 22 18:47:26.362623 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.362605 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:47:26.362839 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.362818 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:47:26.363025 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.362976 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:47:26.383299 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.383259 2577 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qnb4b"] Apr 22 18:47:26.383448 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.383310 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-68s2k"] Apr 22 18:47:26.383448 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.383337 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-68s2k" Apr 22 18:47:26.385841 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.385786 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctxn4\"" Apr 22 18:47:26.386007 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.385988 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:47:26.386220 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.386206 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:47:26.507520 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.507475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnh7w\" (UniqueName: \"kubernetes.io/projected/130369e7-d304-4500-9ad6-18b8f2f4e731-kube-api-access-gnh7w\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k" Apr 22 18:47:26.507691 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.507546 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/130369e7-d304-4500-9ad6-18b8f2f4e731-tmp-dir\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k" Apr 22 18:47:26.507691 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.507650 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/130369e7-d304-4500-9ad6-18b8f2f4e731-config-volume\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k" Apr 22 18:47:26.507802 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.507693 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k" Apr 22 18:47:26.507802 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.507719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b" Apr 22 18:47:26.507802 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.507751 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrpb\" (UniqueName: \"kubernetes.io/projected/be593d71-f465-4468-8034-246bf4c51e73-kube-api-access-tgrpb\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b" Apr 22 18:47:26.608867 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.608780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/130369e7-d304-4500-9ad6-18b8f2f4e731-config-volume\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k" Apr 22 18:47:26.608867 ip-10-0-143-56 kubenswrapper[2577]: I0422 
18:47:26.608819 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k" Apr 22 18:47:26.608867 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.608838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b" Apr 22 18:47:26.609154 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:26.608931 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:26.609154 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:26.608937 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:26.609154 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:26.608994 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert podName:be593d71-f465-4468-8034-246bf4c51e73 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:27.108974989 +0000 UTC m=+34.633638544 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert") pod "ingress-canary-qnb4b" (UID: "be593d71-f465-4468-8034-246bf4c51e73") : secret "canary-serving-cert" not found Apr 22 18:47:26.609154 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:26.609008 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls podName:130369e7-d304-4500-9ad6-18b8f2f4e731 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:27.10900197 +0000 UTC m=+34.633665499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls") pod "dns-default-68s2k" (UID: "130369e7-d304-4500-9ad6-18b8f2f4e731") : secret "dns-default-metrics-tls" not found Apr 22 18:47:26.609154 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.608987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrpb\" (UniqueName: \"kubernetes.io/projected/be593d71-f465-4468-8034-246bf4c51e73-kube-api-access-tgrpb\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b" Apr 22 18:47:26.609154 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.609132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnh7w\" (UniqueName: \"kubernetes.io/projected/130369e7-d304-4500-9ad6-18b8f2f4e731-kube-api-access-gnh7w\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k" Apr 22 18:47:26.609510 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.609167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs\") pod 
\"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:47:26.609510 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.609207 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/130369e7-d304-4500-9ad6-18b8f2f4e731-tmp-dir\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k" Apr 22 18:47:26.609510 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:26.609357 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:47:26.609510 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:26.609417 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs podName:c33a8222-6663-4971-9e27-d05681becacf nodeName:}" failed. No retries permitted until 2026-04-22 18:47:58.609392804 +0000 UTC m=+66.134056336 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs") pod "network-metrics-daemon-4q2cb" (UID: "c33a8222-6663-4971-9e27-d05681becacf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:26.609712 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.609546 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/130369e7-d304-4500-9ad6-18b8f2f4e731-tmp-dir\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k"
Apr 22 18:47:26.613418 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.613395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/130369e7-d304-4500-9ad6-18b8f2f4e731-config-volume\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k"
Apr 22 18:47:26.619787 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.619766 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnh7w\" (UniqueName: \"kubernetes.io/projected/130369e7-d304-4500-9ad6-18b8f2f4e731-kube-api-access-gnh7w\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k"
Apr 22 18:47:26.619893 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.619875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrpb\" (UniqueName: \"kubernetes.io/projected/be593d71-f465-4468-8034-246bf4c51e73-kube-api-access-tgrpb\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b"
Apr 22 18:47:26.710689 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.710506 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gln4q\" (UniqueName: \"kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q\") pod \"network-check-target-h4b8s\" (UID: \"14401ce0-56c3-41fc-9d81-5b7fae368b4c\") " pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:26.710689 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:26.710690 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:47:26.710921 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:26.710712 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:47:26.710921 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:26.710727 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gln4q for pod openshift-network-diagnostics/network-check-target-h4b8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:26.710921 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:26.710807 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q podName:14401ce0-56c3-41fc-9d81-5b7fae368b4c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:58.710786574 +0000 UTC m=+66.235450107 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-gln4q" (UniqueName: "kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q") pod "network-check-target-h4b8s" (UID: "14401ce0-56c3-41fc-9d81-5b7fae368b4c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:26.986997 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.986957 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:26.990016 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.989991 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:47:26.990167 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.990020 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:47:26.990794 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:26.990774 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vhrzn\""
Apr 22 18:47:27.113317 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:27.113280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k"
Apr 22 18:47:27.113485 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:27.113325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b"
Apr 22 18:47:27.113485 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:27.113437 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:27.113608 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:27.113497 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls podName:130369e7-d304-4500-9ad6-18b8f2f4e731 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:28.113482654 +0000 UTC m=+35.638146187 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls") pod "dns-default-68s2k" (UID: "130369e7-d304-4500-9ad6-18b8f2f4e731") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:27.113608 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:27.113505 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:27.113717 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:27.113633 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert podName:be593d71-f465-4468-8034-246bf4c51e73 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:28.113574313 +0000 UTC m=+35.638237853 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert") pod "ingress-canary-qnb4b" (UID: "be593d71-f465-4468-8034-246bf4c51e73") : secret "canary-serving-cert" not found
Apr 22 18:47:27.986891 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:27.986855 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:27.989471 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:27.989444 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:47:27.989996 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:27.989476 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9rzz8\""
Apr 22 18:47:28.121505 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:28.121467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k"
Apr 22 18:47:28.121505 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:28.121505 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b"
Apr 22 18:47:28.121744 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:28.121659 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:28.121744 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:28.121699 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:28.121744 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:28.121725 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls podName:130369e7-d304-4500-9ad6-18b8f2f4e731 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:30.121705439 +0000 UTC m=+37.646368985 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls") pod "dns-default-68s2k" (UID: "130369e7-d304-4500-9ad6-18b8f2f4e731") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:28.121896 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:28.121761 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert podName:be593d71-f465-4468-8034-246bf4c51e73 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:30.121742408 +0000 UTC m=+37.646405940 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert") pod "ingress-canary-qnb4b" (UID: "be593d71-f465-4468-8034-246bf4c51e73") : secret "canary-serving-cert" not found
Apr 22 18:47:30.138487 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:30.138434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k"
Apr 22 18:47:30.138487 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:30.138489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b"
Apr 22 18:47:30.138950 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:30.138590 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:30.138950 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:30.138624 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:30.138950 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:30.138675 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls podName:130369e7-d304-4500-9ad6-18b8f2f4e731 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:34.138653545 +0000 UTC m=+41.663317097 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls") pod "dns-default-68s2k" (UID: "130369e7-d304-4500-9ad6-18b8f2f4e731") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:30.138950 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:30.138694 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert podName:be593d71-f465-4468-8034-246bf4c51e73 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:34.138686119 +0000 UTC m=+41.663349648 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert") pod "ingress-canary-qnb4b" (UID: "be593d71-f465-4468-8034-246bf4c51e73") : secret "canary-serving-cert" not found
Apr 22 18:47:32.164687 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:32.164649 2577 generic.go:358] "Generic (PLEG): container finished" podID="b6c4f00f-22a0-4f0d-bbe0-8b9038175a35" containerID="ad10a2a697c886fe696c2dc9528ec9286c94439abec8e6d2e02e93bd8683bc0b" exitCode=0
Apr 22 18:47:32.165035 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:32.164693 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hm5p8" event={"ID":"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35","Type":"ContainerDied","Data":"ad10a2a697c886fe696c2dc9528ec9286c94439abec8e6d2e02e93bd8683bc0b"}
Apr 22 18:47:33.168812 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:33.168776 2577 generic.go:358] "Generic (PLEG): container finished" podID="b6c4f00f-22a0-4f0d-bbe0-8b9038175a35" containerID="727e7ed54b9c5e361a57b7fd105c9df9cf36985b0cb4bd7e366ede391e2ab292" exitCode=0
Apr 22 18:47:33.169193 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:33.168820 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hm5p8" event={"ID":"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35","Type":"ContainerDied","Data":"727e7ed54b9c5e361a57b7fd105c9df9cf36985b0cb4bd7e366ede391e2ab292"}
Apr 22 18:47:34.169644 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:34.169610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k"
Apr 22 18:47:34.169644 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:34.169647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b"
Apr 22 18:47:34.170167 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:34.169743 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:34.170167 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:34.169744 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:34.170167 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:34.169807 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert podName:be593d71-f465-4468-8034-246bf4c51e73 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:42.169790061 +0000 UTC m=+49.694453591 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert") pod "ingress-canary-qnb4b" (UID: "be593d71-f465-4468-8034-246bf4c51e73") : secret "canary-serving-cert" not found
Apr 22 18:47:34.170167 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:34.169821 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls podName:130369e7-d304-4500-9ad6-18b8f2f4e731 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:42.169814775 +0000 UTC m=+49.694478304 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls") pod "dns-default-68s2k" (UID: "130369e7-d304-4500-9ad6-18b8f2f4e731") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:34.173606 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:34.173582 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hm5p8" event={"ID":"b6c4f00f-22a0-4f0d-bbe0-8b9038175a35","Type":"ContainerStarted","Data":"2cde2766c79cd71947b1497a89c54bc7a83970a980b2eb536f8b8233e4d3eda4"}
Apr 22 18:47:34.194102 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:34.194057 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hm5p8" podStartSLOduration=6.124894562 podStartE2EDuration="41.194045861s" podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:46:55.967921937 +0000 UTC m=+3.492585480" lastFinishedPulling="2026-04-22 18:47:31.03707325 +0000 UTC m=+38.561736779" observedRunningTime="2026-04-22 18:47:34.193726003 +0000 UTC m=+41.718389554" watchObservedRunningTime="2026-04-22 18:47:34.194045861 +0000 UTC m=+41.718709413"
Apr 22 18:47:42.225152 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:42.225114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k"
Apr 22 18:47:42.225152 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:42.225153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b"
Apr 22 18:47:42.225658 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:42.225288 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:42.225658 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:42.225348 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert podName:be593d71-f465-4468-8034-246bf4c51e73 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:58.225334637 +0000 UTC m=+65.749998171 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert") pod "ingress-canary-qnb4b" (UID: "be593d71-f465-4468-8034-246bf4c51e73") : secret "canary-serving-cert" not found
Apr 22 18:47:42.225658 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:42.225288 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:42.225658 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:42.225424 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls podName:130369e7-d304-4500-9ad6-18b8f2f4e731 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:58.225409724 +0000 UTC m=+65.750073267 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls") pod "dns-default-68s2k" (UID: "130369e7-d304-4500-9ad6-18b8f2f4e731") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:51.149933 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:51.149902 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7zbwn"
Apr 22 18:47:58.230891 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:58.230858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k"
Apr 22 18:47:58.230891 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:58.230899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b"
Apr 22 18:47:58.231455 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:58.230999 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:58.231455 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:58.231066 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert podName:be593d71-f465-4468-8034-246bf4c51e73 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:30.231045207 +0000 UTC m=+97.755708743 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert") pod "ingress-canary-qnb4b" (UID: "be593d71-f465-4468-8034-246bf4c51e73") : secret "canary-serving-cert" not found
Apr 22 18:47:58.231455 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:58.230999 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:58.231455 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:58.231139 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls podName:130369e7-d304-4500-9ad6-18b8f2f4e731 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:30.231125891 +0000 UTC m=+97.755789424 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls") pod "dns-default-68s2k" (UID: "130369e7-d304-4500-9ad6-18b8f2f4e731") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:58.632901 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:58.632805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb"
Apr 22 18:47:58.635517 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:58.635495 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:47:58.643625 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:58.643605 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:47:58.643729 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:47:58.643676 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs podName:c33a8222-6663-4971-9e27-d05681becacf nodeName:}" failed. No retries permitted until 2026-04-22 18:49:02.643655195 +0000 UTC m=+130.168318728 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs") pod "network-metrics-daemon-4q2cb" (UID: "c33a8222-6663-4971-9e27-d05681becacf") : secret "metrics-daemon-secret" not found
Apr 22 18:47:58.733706 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:58.733664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gln4q\" (UniqueName: \"kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q\") pod \"network-check-target-h4b8s\" (UID: \"14401ce0-56c3-41fc-9d81-5b7fae368b4c\") " pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:58.737020 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:58.736993 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:47:58.746698 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:58.746678 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:47:58.757727 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:58.757702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gln4q\" (UniqueName: \"kubernetes.io/projected/14401ce0-56c3-41fc-9d81-5b7fae368b4c-kube-api-access-gln4q\") pod \"network-check-target-h4b8s\" (UID: \"14401ce0-56c3-41fc-9d81-5b7fae368b4c\") " pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:58.801348 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:58.801318 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vhrzn\""
Apr 22 18:47:58.809218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:58.809201 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:47:58.930320 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:58.930286 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h4b8s"]
Apr 22 18:47:58.933949 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:47:58.933922 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14401ce0_56c3_41fc_9d81_5b7fae368b4c.slice/crio-8b143bca011d92d6132bf07150b59022269e777337b11f7a9d61728dd214f072 WatchSource:0}: Error finding container 8b143bca011d92d6132bf07150b59022269e777337b11f7a9d61728dd214f072: Status 404 returned error can't find the container with id 8b143bca011d92d6132bf07150b59022269e777337b11f7a9d61728dd214f072
Apr 22 18:47:59.218656 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:47:59.218620 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h4b8s" event={"ID":"14401ce0-56c3-41fc-9d81-5b7fae368b4c","Type":"ContainerStarted","Data":"8b143bca011d92d6132bf07150b59022269e777337b11f7a9d61728dd214f072"}
Apr 22 18:48:02.225381 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:02.225344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h4b8s" event={"ID":"14401ce0-56c3-41fc-9d81-5b7fae368b4c","Type":"ContainerStarted","Data":"f1c8e22a00058b98f09cf84124fb22e9820e21a40c457f45c5b64f65d6774c43"}
Apr 22 18:48:02.225833 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:02.225511 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-h4b8s"
Apr 22 18:48:02.240105 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:02.239994 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-h4b8s" podStartSLOduration=66.282733393 podStartE2EDuration="1m9.239978624s" podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:47:58.937503279 +0000 UTC m=+66.462166812" lastFinishedPulling="2026-04-22 18:48:01.894748511 +0000 UTC m=+69.419412043" observedRunningTime="2026-04-22 18:48:02.239577027 +0000 UTC m=+69.764240578" watchObservedRunningTime="2026-04-22 18:48:02.239978624 +0000 UTC m=+69.764642176"
Apr 22 18:48:22.276573 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.276539 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"]
Apr 22 18:48:22.280575 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.280557 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"
Apr 22 18:48:22.282664 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.282642 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-9c6c9466d-5zwxc"]
Apr 22 18:48:22.283234 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.283212 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 18:48:22.283348 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.283247 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:48:22.287458 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.285095 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-skxpn\""
Apr 22 18:48:22.287458 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.285233 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 18:48:22.287458 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.285363 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:48:22.288470 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.288452 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"]
Apr 22 18:48:22.288578 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.288565 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:22.290940 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.290916 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 18:48:22.290940 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.290936 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 18:48:22.291097 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.291046 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 18:48:22.291178 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.291161 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 18:48:22.291239 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.291169 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 18:48:22.291429 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.291412 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 18:48:22.291429 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.291424 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kx5jk\""
Apr 22 18:48:22.294897 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.294875 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-9c6c9466d-5zwxc"]
Apr 22 18:48:22.395859 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.395833 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-default-certificate\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:22.396012 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.395876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:22.396012 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.395894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:22.396012 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.395937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-stats-auth\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:22.396012 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.395979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"
Apr 22 18:48:22.396012 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.395999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln9kq\" (UniqueName: \"kubernetes.io/projected/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-kube-api-access-ln9kq\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"
Apr 22 18:48:22.396183 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.396061 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szmg8\" (UniqueName: \"kubernetes.io/projected/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-kube-api-access-szmg8\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:22.396183 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.396093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"
Apr 22 18:48:22.497344 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.497296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-default-certificate\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:22.497534 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.497358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:22.497534 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.497394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:22.497534 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.497414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-stats-auth\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:22.497534 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.497452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"
Apr 22 18:48:22.497534 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.497478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln9kq\" (UniqueName: \"kubernetes.io/projected/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-kube-api-access-ln9kq\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID:
\"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx" Apr 22 18:48:22.497534 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:22.497528 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:22.497820 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:22.497555 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:22.497820 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:22.497570 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:22.997542656 +0000 UTC m=+90.522206209 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:22.497820 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:22.497606 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:22.99758829 +0000 UTC m=+90.522251819 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : secret "router-metrics-certs-default" not found Apr 22 18:48:22.497820 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.497648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szmg8\" (UniqueName: \"kubernetes.io/projected/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-kube-api-access-szmg8\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:22.497820 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:22.497674 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls podName:76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:22.9976529 +0000 UTC m=+90.522316433 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7fshx" (UID: "76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:22.497820 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.497722 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx" Apr 22 18:48:22.498354 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.498337 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx" Apr 22 18:48:22.499822 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.499800 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-default-certificate\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:22.499908 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.499847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-stats-auth\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " 
pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:22.505895 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.505874 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szmg8\" (UniqueName: \"kubernetes.io/projected/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-kube-api-access-szmg8\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:22.506511 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:22.506490 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln9kq\" (UniqueName: \"kubernetes.io/projected/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-kube-api-access-ln9kq\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx" Apr 22 18:48:23.001940 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:23.001907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:23.001940 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:23.001941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:23.002308 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:23.001988 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx" Apr 22 18:48:23.002308 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:23.002077 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:24.002062858 +0000 UTC m=+91.526726390 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:23.002308 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:23.002150 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:23.002308 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:23.002174 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:23.002308 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:23.002214 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:24.002195616 +0000 UTC m=+91.526859149 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : secret "router-metrics-certs-default" not found Apr 22 18:48:23.002308 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:23.002238 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls podName:76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:24.002226949 +0000 UTC m=+91.526890478 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7fshx" (UID: "76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:24.009563 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:24.009513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx" Apr 22 18:48:24.009947 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:24.009630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:24.009947 ip-10-0-143-56 kubenswrapper[2577]: I0422 
18:48:24.009661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:24.009947 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:24.009673 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:24.009947 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:24.009746 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls podName:76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:26.009728054 +0000 UTC m=+93.534391596 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7fshx" (UID: "76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:24.009947 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:24.009758 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:24.009947 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:24.009785 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:26.00976975 +0000 UTC m=+93.534433278 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:24.009947 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:24.009811 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:26.009804907 +0000 UTC m=+93.534468435 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : secret "router-metrics-certs-default" not found Apr 22 18:48:26.024064 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:26.024012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:26.024064 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:26.024064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:26.024573 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:26.024115 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx" Apr 22 18:48:26.024573 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:26.024208 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:30.024186021 +0000 UTC m=+97.548849554 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:26.024573 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:26.024251 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:26.024573 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:26.024287 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:26.024573 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:26.024330 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls podName:76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:30.024317996 +0000 UTC m=+97.548981541 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7fshx" (UID: "76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:26.024573 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:26.024371 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:30.024349126 +0000 UTC m=+97.549012661 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : secret "router-metrics-certs-default" not found Apr 22 18:48:28.743039 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:28.743010 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k85gs_77f410ec-92d3-4e11-871d-3bd6da0e0d1f/dns-node-resolver/0.log" Apr 22 18:48:29.222278 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.222245 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6"] Apr 22 18:48:29.225439 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.225423 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" Apr 22 18:48:29.227997 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.227974 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hwch9\"" Apr 22 18:48:29.227997 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.227985 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 18:48:29.228181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.228018 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:29.229019 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.229002 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:29.235052 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.235033 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6"] Apr 22 18:48:29.347261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.347222 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8gp8\" (UniqueName: \"kubernetes.io/projected/81219394-9b4e-4e9d-a98d-d0fd92f6277d-kube-api-access-j8gp8\") pod \"cluster-samples-operator-6dc5bdb6b4-5xwp6\" (UID: \"81219394-9b4e-4e9d-a98d-d0fd92f6277d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" Apr 22 18:48:29.347261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.347284 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5xwp6\" (UID: \"81219394-9b4e-4e9d-a98d-d0fd92f6277d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" Apr 22 18:48:29.447799 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.447771 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8gp8\" (UniqueName: \"kubernetes.io/projected/81219394-9b4e-4e9d-a98d-d0fd92f6277d-kube-api-access-j8gp8\") pod \"cluster-samples-operator-6dc5bdb6b4-5xwp6\" (UID: \"81219394-9b4e-4e9d-a98d-d0fd92f6277d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" Apr 22 18:48:29.447890 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.447812 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5xwp6\" (UID: \"81219394-9b4e-4e9d-a98d-d0fd92f6277d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" Apr 22 18:48:29.447963 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:29.447951 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:48:29.448008 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:29.448001 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls podName:81219394-9b4e-4e9d-a98d-d0fd92f6277d nodeName:}" failed. No retries permitted until 2026-04-22 18:48:29.947987556 +0000 UTC m=+97.472651085 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5xwp6" (UID: "81219394-9b4e-4e9d-a98d-d0fd92f6277d") : secret "samples-operator-tls" not found Apr 22 18:48:29.458546 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.458520 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8gp8\" (UniqueName: \"kubernetes.io/projected/81219394-9b4e-4e9d-a98d-d0fd92f6277d-kube-api-access-j8gp8\") pod \"cluster-samples-operator-6dc5bdb6b4-5xwp6\" (UID: \"81219394-9b4e-4e9d-a98d-d0fd92f6277d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" Apr 22 18:48:29.545676 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.545610 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5t6hp_023c27e3-86e9-4182-bf8f-c7b6197bc958/node-ca/0.log" Apr 22 18:48:29.952959 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:29.952858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5xwp6\" (UID: \"81219394-9b4e-4e9d-a98d-d0fd92f6277d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" Apr 22 18:48:29.953482 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:29.952988 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:48:29.953482 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:29.953060 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls podName:81219394-9b4e-4e9d-a98d-d0fd92f6277d nodeName:}" failed. 
No retries permitted until 2026-04-22 18:48:30.953038387 +0000 UTC m=+98.477701919 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5xwp6" (UID: "81219394-9b4e-4e9d-a98d-d0fd92f6277d") : secret "samples-operator-tls" not found Apr 22 18:48:30.053414 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.053374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx" Apr 22 18:48:30.053559 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:30.053425 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:30.053559 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:30.053494 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls podName:76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:38.053478976 +0000 UTC m=+105.578142510 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7fshx" (UID: "76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:30.053559 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.053525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:30.053559 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.053544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc" Apr 22 18:48:30.053694 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:30.053630 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:30.053694 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:30.053643 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:38.053630795 +0000 UTC m=+105.578294323 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:30.053694 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:30.053663 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:38.053652706 +0000 UTC m=+105.578316236 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : secret "router-metrics-certs-default" not found Apr 22 18:48:30.220286 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.220234 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p8mkx"] Apr 22 18:48:30.223127 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.223104 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p8mkx" Apr 22 18:48:30.225532 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.225513 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:30.226533 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.226516 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-kws7z\"" Apr 22 18:48:30.226586 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.226551 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:30.230089 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.230069 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p8mkx"] Apr 22 18:48:30.256151 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.256124 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k" Apr 22 18:48:30.256322 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.256155 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b" Apr 22 18:48:30.256322 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:30.256296 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not 
found Apr 22 18:48:30.256406 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:30.256322 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:48:30.256406 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:30.256376 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls podName:130369e7-d304-4500-9ad6-18b8f2f4e731 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:34.256360184 +0000 UTC m=+161.781023718 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls") pod "dns-default-68s2k" (UID: "130369e7-d304-4500-9ad6-18b8f2f4e731") : secret "dns-default-metrics-tls" not found Apr 22 18:48:30.256406 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:30.256391 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert podName:be593d71-f465-4468-8034-246bf4c51e73 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:34.256385392 +0000 UTC m=+161.781048922 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert") pod "ingress-canary-qnb4b" (UID: "be593d71-f465-4468-8034-246bf4c51e73") : secret "canary-serving-cert" not found Apr 22 18:48:30.356681 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.356642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v57zg\" (UniqueName: \"kubernetes.io/projected/86d47618-2f32-40db-a2c3-e3a19e106b16-kube-api-access-v57zg\") pod \"volume-data-source-validator-7c6cbb6c87-p8mkx\" (UID: \"86d47618-2f32-40db-a2c3-e3a19e106b16\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p8mkx" Apr 22 18:48:30.457518 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.457467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v57zg\" (UniqueName: \"kubernetes.io/projected/86d47618-2f32-40db-a2c3-e3a19e106b16-kube-api-access-v57zg\") pod \"volume-data-source-validator-7c6cbb6c87-p8mkx\" (UID: \"86d47618-2f32-40db-a2c3-e3a19e106b16\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p8mkx" Apr 22 18:48:30.465404 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.465384 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v57zg\" (UniqueName: \"kubernetes.io/projected/86d47618-2f32-40db-a2c3-e3a19e106b16-kube-api-access-v57zg\") pod \"volume-data-source-validator-7c6cbb6c87-p8mkx\" (UID: \"86d47618-2f32-40db-a2c3-e3a19e106b16\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p8mkx" Apr 22 18:48:30.532380 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.532301 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p8mkx" Apr 22 18:48:30.653140 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.653101 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p8mkx"] Apr 22 18:48:30.656763 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:48:30.656733 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86d47618_2f32_40db_a2c3_e3a19e106b16.slice/crio-24e08dd2d8a89ceda48ab3345abc29e657df3588c02cd511cf04137b09f17148 WatchSource:0}: Error finding container 24e08dd2d8a89ceda48ab3345abc29e657df3588c02cd511cf04137b09f17148: Status 404 returned error can't find the container with id 24e08dd2d8a89ceda48ab3345abc29e657df3588c02cd511cf04137b09f17148 Apr 22 18:48:30.962383 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:30.962351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5xwp6\" (UID: \"81219394-9b4e-4e9d-a98d-d0fd92f6277d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" Apr 22 18:48:30.962743 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:30.962502 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:48:30.962743 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:30.962568 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls podName:81219394-9b4e-4e9d-a98d-d0fd92f6277d nodeName:}" failed. No retries permitted until 2026-04-22 18:48:32.962551901 +0000 UTC m=+100.487215444 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5xwp6" (UID: "81219394-9b4e-4e9d-a98d-d0fd92f6277d") : secret "samples-operator-tls" not found Apr 22 18:48:31.220004 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.219911 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-l4kjm"] Apr 22 18:48:31.222788 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.222766 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.225182 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.225157 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 18:48:31.225311 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.225233 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-jrdtp\"" Apr 22 18:48:31.225431 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.225414 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:31.226524 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.226500 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:31.226524 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.226493 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 18:48:31.231455 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.231282 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 18:48:31.232140 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.232117 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-l4kjm"] Apr 22 18:48:31.265837 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.265778 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/407ba526-67b3-4fe5-9bc6-2c9894fb034f-trusted-ca\") pod \"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.265837 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.265826 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9rdh\" (UniqueName: \"kubernetes.io/projected/407ba526-67b3-4fe5-9bc6-2c9894fb034f-kube-api-access-c9rdh\") pod \"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.266053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.265937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/407ba526-67b3-4fe5-9bc6-2c9894fb034f-config\") pod \"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.266053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.265989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/407ba526-67b3-4fe5-9bc6-2c9894fb034f-serving-cert\") pod \"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.281374 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.281332 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p8mkx" event={"ID":"86d47618-2f32-40db-a2c3-e3a19e106b16","Type":"ContainerStarted","Data":"24e08dd2d8a89ceda48ab3345abc29e657df3588c02cd511cf04137b09f17148"} Apr 22 18:48:31.367261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.367220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/407ba526-67b3-4fe5-9bc6-2c9894fb034f-serving-cert\") pod \"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.367463 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.367384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/407ba526-67b3-4fe5-9bc6-2c9894fb034f-trusted-ca\") pod \"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.367463 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.367413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9rdh\" (UniqueName: \"kubernetes.io/projected/407ba526-67b3-4fe5-9bc6-2c9894fb034f-kube-api-access-c9rdh\") pod \"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.367578 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.367466 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/407ba526-67b3-4fe5-9bc6-2c9894fb034f-config\") pod 
\"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.371512 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.368468 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/407ba526-67b3-4fe5-9bc6-2c9894fb034f-config\") pod \"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.371512 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.368745 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/407ba526-67b3-4fe5-9bc6-2c9894fb034f-trusted-ca\") pod \"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.371512 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.370645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/407ba526-67b3-4fe5-9bc6-2c9894fb034f-serving-cert\") pod \"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.376188 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.376163 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9rdh\" (UniqueName: \"kubernetes.io/projected/407ba526-67b3-4fe5-9bc6-2c9894fb034f-kube-api-access-c9rdh\") pod \"console-operator-9d4b6777b-l4kjm\" (UID: \"407ba526-67b3-4fe5-9bc6-2c9894fb034f\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.534333 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.534233 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:48:31.652032 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:31.651997 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-l4kjm"] Apr 22 18:48:31.930207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:48:31.930130 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod407ba526_67b3_4fe5_9bc6_2c9894fb034f.slice/crio-61bbb30813248c0e726a1a77b0f6e3c9697aaac33c073535f2702e2a178b5b3d WatchSource:0}: Error finding container 61bbb30813248c0e726a1a77b0f6e3c9697aaac33c073535f2702e2a178b5b3d: Status 404 returned error can't find the container with id 61bbb30813248c0e726a1a77b0f6e3c9697aaac33c073535f2702e2a178b5b3d Apr 22 18:48:32.226049 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.226015 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc"] Apr 22 18:48:32.228768 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.228753 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" Apr 22 18:48:32.231246 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.231221 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:32.231400 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.231224 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:32.231400 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.231317 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 18:48:32.231400 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.231334 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-rn5ht\"" Apr 22 18:48:32.231568 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.231554 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 18:48:32.236943 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.236922 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc"] Apr 22 18:48:32.275076 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.275039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sldqs\" (UniqueName: \"kubernetes.io/projected/20a2b785-7a65-4033-ae4d-0275a248aec8-kube-api-access-sldqs\") pod \"kube-storage-version-migrator-operator-6769c5d45-gkbcc\" (UID: 
\"20a2b785-7a65-4033-ae4d-0275a248aec8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" Apr 22 18:48:32.275240 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.275098 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20a2b785-7a65-4033-ae4d-0275a248aec8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gkbcc\" (UID: \"20a2b785-7a65-4033-ae4d-0275a248aec8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" Apr 22 18:48:32.275240 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.275131 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a2b785-7a65-4033-ae4d-0275a248aec8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gkbcc\" (UID: \"20a2b785-7a65-4033-ae4d-0275a248aec8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" Apr 22 18:48:32.284539 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.284507 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p8mkx" event={"ID":"86d47618-2f32-40db-a2c3-e3a19e106b16","Type":"ContainerStarted","Data":"b79dbe3d6afd8b54cc1632871a35c250307ef16323fd6139a69a8db9a12ed9b0"} Apr 22 18:48:32.285493 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.285474 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" event={"ID":"407ba526-67b3-4fe5-9bc6-2c9894fb034f","Type":"ContainerStarted","Data":"61bbb30813248c0e726a1a77b0f6e3c9697aaac33c073535f2702e2a178b5b3d"} Apr 22 18:48:32.298165 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.298128 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p8mkx" podStartSLOduration=0.98296908 podStartE2EDuration="2.298114024s" podCreationTimestamp="2026-04-22 18:48:30 +0000 UTC" firstStartedPulling="2026-04-22 18:48:30.658582662 +0000 UTC m=+98.183246191" lastFinishedPulling="2026-04-22 18:48:31.973727593 +0000 UTC m=+99.498391135" observedRunningTime="2026-04-22 18:48:32.297385279 +0000 UTC m=+99.822048855" watchObservedRunningTime="2026-04-22 18:48:32.298114024 +0000 UTC m=+99.822777576" Apr 22 18:48:32.376497 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.376456 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sldqs\" (UniqueName: \"kubernetes.io/projected/20a2b785-7a65-4033-ae4d-0275a248aec8-kube-api-access-sldqs\") pod \"kube-storage-version-migrator-operator-6769c5d45-gkbcc\" (UID: \"20a2b785-7a65-4033-ae4d-0275a248aec8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" Apr 22 18:48:32.376647 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.376542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20a2b785-7a65-4033-ae4d-0275a248aec8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gkbcc\" (UID: \"20a2b785-7a65-4033-ae4d-0275a248aec8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" Apr 22 18:48:32.376647 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.376579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a2b785-7a65-4033-ae4d-0275a248aec8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gkbcc\" (UID: \"20a2b785-7a65-4033-ae4d-0275a248aec8\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" Apr 22 18:48:32.377171 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.377145 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a2b785-7a65-4033-ae4d-0275a248aec8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gkbcc\" (UID: \"20a2b785-7a65-4033-ae4d-0275a248aec8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" Apr 22 18:48:32.378753 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.378735 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20a2b785-7a65-4033-ae4d-0275a248aec8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gkbcc\" (UID: \"20a2b785-7a65-4033-ae4d-0275a248aec8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" Apr 22 18:48:32.384022 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.384003 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sldqs\" (UniqueName: \"kubernetes.io/projected/20a2b785-7a65-4033-ae4d-0275a248aec8-kube-api-access-sldqs\") pod \"kube-storage-version-migrator-operator-6769c5d45-gkbcc\" (UID: \"20a2b785-7a65-4033-ae4d-0275a248aec8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" Apr 22 18:48:32.537962 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.537883 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" Apr 22 18:48:32.672626 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.672590 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc"] Apr 22 18:48:32.676631 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:48:32.676598 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20a2b785_7a65_4033_ae4d_0275a248aec8.slice/crio-cce96a1cbc60b6b9256dba082ba0d6cba48a26869cdbb30458d5850e44df59cd WatchSource:0}: Error finding container cce96a1cbc60b6b9256dba082ba0d6cba48a26869cdbb30458d5850e44df59cd: Status 404 returned error can't find the container with id cce96a1cbc60b6b9256dba082ba0d6cba48a26869cdbb30458d5850e44df59cd Apr 22 18:48:32.981693 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:32.981656 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5xwp6\" (UID: \"81219394-9b4e-4e9d-a98d-d0fd92f6277d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" Apr 22 18:48:32.981857 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:32.981825 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:48:32.981916 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:32.981901 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls podName:81219394-9b4e-4e9d-a98d-d0fd92f6277d nodeName:}" failed. 
No retries permitted until 2026-04-22 18:48:36.981883564 +0000 UTC m=+104.506547094 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5xwp6" (UID: "81219394-9b4e-4e9d-a98d-d0fd92f6277d") : secret "samples-operator-tls" not found Apr 22 18:48:33.230697 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:33.230657 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-h4b8s" Apr 22 18:48:33.289077 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:33.288998 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" event={"ID":"20a2b785-7a65-4033-ae4d-0275a248aec8","Type":"ContainerStarted","Data":"cce96a1cbc60b6b9256dba082ba0d6cba48a26869cdbb30458d5850e44df59cd"} Apr 22 18:48:35.293432 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:35.293402 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/0.log" Apr 22 18:48:35.293777 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:35.293442 2577 generic.go:358] "Generic (PLEG): container finished" podID="407ba526-67b3-4fe5-9bc6-2c9894fb034f" containerID="422b64707d00ffcb1e24bd1e277d3856cd76ca26936ce53a1bfeeaa5d5eb5ff7" exitCode=255 Apr 22 18:48:35.293777 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:35.293473 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" event={"ID":"407ba526-67b3-4fe5-9bc6-2c9894fb034f","Type":"ContainerDied","Data":"422b64707d00ffcb1e24bd1e277d3856cd76ca26936ce53a1bfeeaa5d5eb5ff7"} Apr 22 18:48:35.293777 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:35.293734 
2577 scope.go:117] "RemoveContainer" containerID="422b64707d00ffcb1e24bd1e277d3856cd76ca26936ce53a1bfeeaa5d5eb5ff7" Apr 22 18:48:36.281647 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.281611 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qlzm7"] Apr 22 18:48:36.283352 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.283328 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qlzm7" Apr 22 18:48:36.285708 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.285686 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-kgcfh\"" Apr 22 18:48:36.291783 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.291759 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qlzm7"] Apr 22 18:48:36.296940 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.296923 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/1.log" Apr 22 18:48:36.297305 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.297289 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/0.log" Apr 22 18:48:36.297377 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.297330 2577 generic.go:358] "Generic (PLEG): container finished" podID="407ba526-67b3-4fe5-9bc6-2c9894fb034f" containerID="d493bca8080642e45b235a8a95349fa3396eeb77ed485f6f863bbe8770740df5" exitCode=255 Apr 22 18:48:36.297443 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.297414 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" event={"ID":"407ba526-67b3-4fe5-9bc6-2c9894fb034f","Type":"ContainerDied","Data":"d493bca8080642e45b235a8a95349fa3396eeb77ed485f6f863bbe8770740df5"}
Apr 22 18:48:36.297484 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.297463 2577 scope.go:117] "RemoveContainer" containerID="422b64707d00ffcb1e24bd1e277d3856cd76ca26936ce53a1bfeeaa5d5eb5ff7"
Apr 22 18:48:36.297664 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.297648 2577 scope.go:117] "RemoveContainer" containerID="d493bca8080642e45b235a8a95349fa3396eeb77ed485f6f863bbe8770740df5"
Apr 22 18:48:36.297875 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:36.297856 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-l4kjm_openshift-console-operator(407ba526-67b3-4fe5-9bc6-2c9894fb034f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" podUID="407ba526-67b3-4fe5-9bc6-2c9894fb034f"
Apr 22 18:48:36.298743 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.298719 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" event={"ID":"20a2b785-7a65-4033-ae4d-0275a248aec8","Type":"ContainerStarted","Data":"eaa6accaa14ae9b7ec2db273cfb303fae3ccc92271090875d27453ea01e63cd3"}
Apr 22 18:48:36.331745 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.331690 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" podStartSLOduration=1.575818259 podStartE2EDuration="4.331675441s" podCreationTimestamp="2026-04-22 18:48:32 +0000 UTC" firstStartedPulling="2026-04-22 18:48:32.678781578 +0000 UTC m=+100.203445109" lastFinishedPulling="2026-04-22 18:48:35.434638758 +0000 UTC m=+102.959302291" observedRunningTime="2026-04-22 18:48:36.331366492 +0000 UTC m=+103.856030044" watchObservedRunningTime="2026-04-22 18:48:36.331675441 +0000 UTC m=+103.856338993"
Apr 22 18:48:36.408803 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.408747 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsr6l\" (UniqueName: \"kubernetes.io/projected/4630660b-1003-4863-8be0-f42b5940db5c-kube-api-access-lsr6l\") pod \"network-check-source-8894fc9bd-qlzm7\" (UID: \"4630660b-1003-4863-8be0-f42b5940db5c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qlzm7"
Apr 22 18:48:36.510215 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.510154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsr6l\" (UniqueName: \"kubernetes.io/projected/4630660b-1003-4863-8be0-f42b5940db5c-kube-api-access-lsr6l\") pod \"network-check-source-8894fc9bd-qlzm7\" (UID: \"4630660b-1003-4863-8be0-f42b5940db5c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qlzm7"
Apr 22 18:48:36.517927 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.517897 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsr6l\" (UniqueName: \"kubernetes.io/projected/4630660b-1003-4863-8be0-f42b5940db5c-kube-api-access-lsr6l\") pod \"network-check-source-8894fc9bd-qlzm7\" (UID: \"4630660b-1003-4863-8be0-f42b5940db5c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qlzm7"
Apr 22 18:48:36.592281 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.592185 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qlzm7"
Apr 22 18:48:36.700823 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:36.700791 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qlzm7"]
Apr 22 18:48:36.703650 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:48:36.703622 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4630660b_1003_4863_8be0_f42b5940db5c.slice/crio-ec63f604bfad3fbe829f5938839ab4bdd382e6fd2ab858880842395bec7c88b5 WatchSource:0}: Error finding container ec63f604bfad3fbe829f5938839ab4bdd382e6fd2ab858880842395bec7c88b5: Status 404 returned error can't find the container with id ec63f604bfad3fbe829f5938839ab4bdd382e6fd2ab858880842395bec7c88b5
Apr 22 18:48:37.013878 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:37.013842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5xwp6\" (UID: \"81219394-9b4e-4e9d-a98d-d0fd92f6277d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6"
Apr 22 18:48:37.014109 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:37.014005 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:48:37.014109 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:37.014091 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls podName:81219394-9b4e-4e9d-a98d-d0fd92f6277d nodeName:}" failed. No retries permitted until 2026-04-22 18:48:45.014069304 +0000 UTC m=+112.538732849 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5xwp6" (UID: "81219394-9b4e-4e9d-a98d-d0fd92f6277d") : secret "samples-operator-tls" not found
Apr 22 18:48:37.302433 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:37.302356 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/1.log"
Apr 22 18:48:37.302993 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:37.302729 2577 scope.go:117] "RemoveContainer" containerID="d493bca8080642e45b235a8a95349fa3396eeb77ed485f6f863bbe8770740df5"
Apr 22 18:48:37.302993 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:37.302914 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-l4kjm_openshift-console-operator(407ba526-67b3-4fe5-9bc6-2c9894fb034f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" podUID="407ba526-67b3-4fe5-9bc6-2c9894fb034f"
Apr 22 18:48:37.303798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:37.303772 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qlzm7" event={"ID":"4630660b-1003-4863-8be0-f42b5940db5c","Type":"ContainerStarted","Data":"9543009998cf97d9b6d0a3870a78551eb9eca02f1eb646fce9ab9171a70dae49"}
Apr 22 18:48:37.303876 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:37.303809 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qlzm7" event={"ID":"4630660b-1003-4863-8be0-f42b5940db5c","Type":"ContainerStarted","Data":"ec63f604bfad3fbe829f5938839ab4bdd382e6fd2ab858880842395bec7c88b5"}
Apr 22 18:48:37.331748 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:37.331704 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qlzm7" podStartSLOduration=1.3316914930000001 podStartE2EDuration="1.331691493s" podCreationTimestamp="2026-04-22 18:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:48:37.330915439 +0000 UTC m=+104.855578991" watchObservedRunningTime="2026-04-22 18:48:37.331691493 +0000 UTC m=+104.856355043"
Apr 22 18:48:38.122024 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:38.121978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:38.122024 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:38.122022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:38.122242 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:38.122114 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:48:38.122242 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:38.122147 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:54.122127749 +0000 UTC m=+121.646791283 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : configmap references non-existent config key: service-ca.crt
Apr 22 18:48:38.122242 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:38.122180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"
Apr 22 18:48:38.122375 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:38.122245 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs podName:a14c1def-cf59-4fe4-a62f-26d8cc86cd77 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:54.122237367 +0000 UTC m=+121.646900897 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs") pod "router-default-9c6c9466d-5zwxc" (UID: "a14c1def-cf59-4fe4-a62f-26d8cc86cd77") : secret "router-metrics-certs-default" not found
Apr 22 18:48:38.122375 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:38.122256 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:48:38.122375 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:38.122321 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls podName:76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:54.1223101 +0000 UTC m=+121.646973633 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7fshx" (UID: "76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:48:41.535017 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:41.534987 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm"
Apr 22 18:48:41.535017 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:41.535020 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm"
Apr 22 18:48:41.535409 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:41.535352 2577 scope.go:117] "RemoveContainer" containerID="d493bca8080642e45b235a8a95349fa3396eeb77ed485f6f863bbe8770740df5"
Apr 22 18:48:41.535530 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:41.535512 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-l4kjm_openshift-console-operator(407ba526-67b3-4fe5-9bc6-2c9894fb034f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" podUID="407ba526-67b3-4fe5-9bc6-2c9894fb034f"
Apr 22 18:48:45.078786 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:45.078731 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5xwp6\" (UID: \"81219394-9b4e-4e9d-a98d-d0fd92f6277d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6"
Apr 22 18:48:45.081137 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:45.081107 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81219394-9b4e-4e9d-a98d-d0fd92f6277d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5xwp6\" (UID: \"81219394-9b4e-4e9d-a98d-d0fd92f6277d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6"
Apr 22 18:48:45.134111 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:45.134084 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6"
Apr 22 18:48:45.244163 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:45.244132 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6"]
Apr 22 18:48:45.320503 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:45.320475 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" event={"ID":"81219394-9b4e-4e9d-a98d-d0fd92f6277d","Type":"ContainerStarted","Data":"9719515a71d81139df426a02cbf47b2ea7ca1fbce962f140fcaee7b8b1f735f2"}
Apr 22 18:48:47.326291 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:47.326238 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" event={"ID":"81219394-9b4e-4e9d-a98d-d0fd92f6277d","Type":"ContainerStarted","Data":"c7be2245c13d2693b3932f994f11d64a5a33f0d8fa2049e450f275fb4724929c"}
Apr 22 18:48:47.326291 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:47.326286 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" event={"ID":"81219394-9b4e-4e9d-a98d-d0fd92f6277d","Type":"ContainerStarted","Data":"66068f73740e4c0da106223fdeba4e8df001632a0b49bd55e2b5f7ab56845073"}
Apr 22 18:48:47.341478 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:47.341431 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5xwp6" podStartSLOduration=16.685968564 podStartE2EDuration="18.341417799s" podCreationTimestamp="2026-04-22 18:48:29 +0000 UTC" firstStartedPulling="2026-04-22 18:48:45.287588511 +0000 UTC m=+112.812252041" lastFinishedPulling="2026-04-22 18:48:46.943037743 +0000 UTC m=+114.467701276" observedRunningTime="2026-04-22 18:48:47.34078814 +0000 UTC m=+114.865451693" watchObservedRunningTime="2026-04-22 18:48:47.341417799 +0000 UTC m=+114.866081350"
Apr 22 18:48:53.986700 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:53.986670 2577 scope.go:117] "RemoveContainer" containerID="d493bca8080642e45b235a8a95349fa3396eeb77ed485f6f863bbe8770740df5"
Apr 22 18:48:54.151585 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.151551 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"
Apr 22 18:48:54.151777 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.151734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:54.151777 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.151770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:54.152475 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.152451 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-service-ca-bundle\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:54.154017 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.153998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7fshx\" (UID: \"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"
Apr 22 18:48:54.154222 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.154201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a14c1def-cf59-4fe4-a62f-26d8cc86cd77-metrics-certs\") pod \"router-default-9c6c9466d-5zwxc\" (UID: \"a14c1def-cf59-4fe4-a62f-26d8cc86cd77\") " pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:54.345499 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.345424 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log"
Apr 22 18:48:54.345778 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.345762 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/1.log"
Apr 22 18:48:54.345830 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.345796 2577 generic.go:358] "Generic (PLEG): container finished" podID="407ba526-67b3-4fe5-9bc6-2c9894fb034f" containerID="01fc3b860886a61fe9e6cfce2f6c9cd3069a5f4a8d3a4a22ab404f174d249e42" exitCode=255
Apr 22 18:48:54.345871 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.345855 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" event={"ID":"407ba526-67b3-4fe5-9bc6-2c9894fb034f","Type":"ContainerDied","Data":"01fc3b860886a61fe9e6cfce2f6c9cd3069a5f4a8d3a4a22ab404f174d249e42"}
Apr 22 18:48:54.345910 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.345896 2577 scope.go:117] "RemoveContainer" containerID="d493bca8080642e45b235a8a95349fa3396eeb77ed485f6f863bbe8770740df5"
Apr 22 18:48:54.346222 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.346205 2577 scope.go:117] "RemoveContainer" containerID="01fc3b860886a61fe9e6cfce2f6c9cd3069a5f4a8d3a4a22ab404f174d249e42"
Apr 22 18:48:54.346439 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:48:54.346420 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-l4kjm_openshift-console-operator(407ba526-67b3-4fe5-9bc6-2c9894fb034f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" podUID="407ba526-67b3-4fe5-9bc6-2c9894fb034f"
Apr 22 18:48:54.395682 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.395651 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-skxpn\""
Apr 22 18:48:54.402975 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.402954 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kx5jk\""
Apr 22 18:48:54.403986 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.403968 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"
Apr 22 18:48:54.410943 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.410826 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:54.527212 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.527182 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx"]
Apr 22 18:48:54.530325 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:48:54.530300 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76eb6fb3_6155_4b2e_86d2_e26d23bbb6f3.slice/crio-dcd656c0bb7fce8d7bde7ed8afab5486203d609bec0ebd159a26c061d18bd3f1 WatchSource:0}: Error finding container dcd656c0bb7fce8d7bde7ed8afab5486203d609bec0ebd159a26c061d18bd3f1: Status 404 returned error can't find the container with id dcd656c0bb7fce8d7bde7ed8afab5486203d609bec0ebd159a26c061d18bd3f1
Apr 22 18:48:54.546523 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:48:54.546492 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda14c1def_cf59_4fe4_a62f_26d8cc86cd77.slice/crio-be92f7e424cdd9afba93295bfd00afd87a6d3c8592cea4b4e3eb324dfdbe544f WatchSource:0}: Error finding container be92f7e424cdd9afba93295bfd00afd87a6d3c8592cea4b4e3eb324dfdbe544f: Status 404 returned error can't find the container with id be92f7e424cdd9afba93295bfd00afd87a6d3c8592cea4b4e3eb324dfdbe544f
Apr 22 18:48:54.558260 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:54.558238 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-9c6c9466d-5zwxc"]
Apr 22 18:48:55.349579 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:55.349529 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-9c6c9466d-5zwxc" event={"ID":"a14c1def-cf59-4fe4-a62f-26d8cc86cd77","Type":"ContainerStarted","Data":"bfc23ab28026385e5fe47542bebc0f45c79f1efd9c2e19393e33d33f43b6a0eb"}
Apr 22 18:48:55.349579 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:55.349581 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-9c6c9466d-5zwxc" event={"ID":"a14c1def-cf59-4fe4-a62f-26d8cc86cd77","Type":"ContainerStarted","Data":"be92f7e424cdd9afba93295bfd00afd87a6d3c8592cea4b4e3eb324dfdbe544f"}
Apr 22 18:48:55.351057 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:55.351023 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log"
Apr 22 18:48:55.352207 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:55.352167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx" event={"ID":"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3","Type":"ContainerStarted","Data":"dcd656c0bb7fce8d7bde7ed8afab5486203d609bec0ebd159a26c061d18bd3f1"}
Apr 22 18:48:55.367860 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:55.367815 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-9c6c9466d-5zwxc" podStartSLOduration=33.367802515 podStartE2EDuration="33.367802515s" podCreationTimestamp="2026-04-22 18:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:48:55.366949485 +0000 UTC m=+122.891613058" watchObservedRunningTime="2026-04-22 18:48:55.367802515 +0000 UTC m=+122.892466044"
Apr 22 18:48:55.411466 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:55.411431 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:55.414762 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:55.414736 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:56.355884 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:56.355797 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx" event={"ID":"76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3","Type":"ContainerStarted","Data":"0ae8f7b720307c5191b0df69b21a85ac7da700f1e7ad3d5e5f848bcbaf49f154"}
Apr 22 18:48:56.356335 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:56.356006 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:56.357150 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:56.357133 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-9c6c9466d-5zwxc"
Apr 22 18:48:56.373111 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:48:56.373060 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7fshx" podStartSLOduration=32.84543876 podStartE2EDuration="34.37304434s" podCreationTimestamp="2026-04-22 18:48:22 +0000 UTC" firstStartedPulling="2026-04-22 18:48:54.532127269 +0000 UTC m=+122.056790802" lastFinishedPulling="2026-04-22 18:48:56.05973285 +0000 UTC m=+123.584396382" observedRunningTime="2026-04-22 18:48:56.370953868 +0000 UTC m=+123.895617419" watchObservedRunningTime="2026-04-22 18:48:56.37304434 +0000 UTC m=+123.897707892"
Apr 22 18:49:01.534952 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:01.534903 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm"
Apr 22 18:49:01.534952 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:01.534943 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm"
Apr 22 18:49:01.535381 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:01.535295 2577 scope.go:117] "RemoveContainer" containerID="01fc3b860886a61fe9e6cfce2f6c9cd3069a5f4a8d3a4a22ab404f174d249e42"
Apr 22 18:49:01.535474 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:49:01.535456 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-l4kjm_openshift-console-operator(407ba526-67b3-4fe5-9bc6-2c9894fb034f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" podUID="407ba526-67b3-4fe5-9bc6-2c9894fb034f"
Apr 22 18:49:02.055062 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.055027 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-597d994cdc-cvzrg"]
Apr 22 18:49:02.057044 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.057023 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-597d994cdc-cvzrg"
Apr 22 18:49:02.060668 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.060639 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 18:49:02.060801 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.060679 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 18:49:02.061600 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.061579 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8zvm4\""
Apr 22 18:49:02.062056 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.062038 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 18:49:02.065974 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.065953 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 18:49:02.070438 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.070418 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9gjnl"]
Apr 22 18:49:02.072586 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.072565 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9gjnl"
Apr 22 18:49:02.073377 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.073354 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-597d994cdc-cvzrg"]
Apr 22 18:49:02.075834 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.075817 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 18:49:02.075934 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.075873 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6dkrn\""
Apr 22 18:49:02.076001 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.075937 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:49:02.076250 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.076234 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 18:49:02.076359 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.076250 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 18:49:02.087227 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.087208 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9gjnl"]
Apr 22 18:49:02.119106 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5147e527-afd1-402c-995b-814eebf64541-crio-socket\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl"
Apr 22 18:49:02.119258 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119109 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5147e527-afd1-402c-995b-814eebf64541-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl"
Apr 22 18:49:02.119258 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00ddfe65-9d38-4b62-a7fc-877af5eec212-trusted-ca\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg"
Apr 22 18:49:02.119258 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119204 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00ddfe65-9d38-4b62-a7fc-877af5eec212-bound-sa-token\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg"
Apr 22 18:49:02.119258 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5147e527-afd1-402c-995b-814eebf64541-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl"
Apr 22 18:49:02.119258 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/00ddfe65-9d38-4b62-a7fc-877af5eec212-registry-certificates\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg"
Apr 22 18:49:02.119498 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119290 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/00ddfe65-9d38-4b62-a7fc-877af5eec212-installation-pull-secrets\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg"
Apr 22 18:49:02.119498 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/00ddfe65-9d38-4b62-a7fc-877af5eec212-registry-tls\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg"
Apr 22 18:49:02.119498 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5147e527-afd1-402c-995b-814eebf64541-data-volume\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl"
Apr 22 18:49:02.119498 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119440 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpd9v\" (UniqueName: \"kubernetes.io/projected/00ddfe65-9d38-4b62-a7fc-877af5eec212-kube-api-access-wpd9v\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg"
Apr 22 18:49:02.119498 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djlv5\" (UniqueName: \"kubernetes.io/projected/5147e527-afd1-402c-995b-814eebf64541-kube-api-access-djlv5\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl"
Apr 22 18:49:02.119498 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119482 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/00ddfe65-9d38-4b62-a7fc-877af5eec212-ca-trust-extracted\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg"
Apr 22 18:49:02.119698 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.119519 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/00ddfe65-9d38-4b62-a7fc-877af5eec212-image-registry-private-configuration\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg"
Apr 22 18:49:02.220029 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.219997
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5147e527-afd1-402c-995b-814eebf64541-crio-socket\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl" Apr 22 18:49:02.220029 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5147e527-afd1-402c-995b-814eebf64541-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl" Apr 22 18:49:02.220261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00ddfe65-9d38-4b62-a7fc-877af5eec212-trusted-ca\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.220261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00ddfe65-9d38-4b62-a7fc-877af5eec212-bound-sa-token\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.220261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220107 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5147e527-afd1-402c-995b-814eebf64541-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9gjnl\" (UID: 
\"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl" Apr 22 18:49:02.220261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220113 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5147e527-afd1-402c-995b-814eebf64541-crio-socket\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl" Apr 22 18:49:02.220261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/00ddfe65-9d38-4b62-a7fc-877af5eec212-registry-certificates\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.220261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220150 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/00ddfe65-9d38-4b62-a7fc-877af5eec212-installation-pull-secrets\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.220261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/00ddfe65-9d38-4b62-a7fc-877af5eec212-registry-tls\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.220261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5147e527-afd1-402c-995b-814eebf64541-data-volume\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl" Apr 22 18:49:02.220261 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpd9v\" (UniqueName: \"kubernetes.io/projected/00ddfe65-9d38-4b62-a7fc-877af5eec212-kube-api-access-wpd9v\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.220785 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220311 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djlv5\" (UniqueName: \"kubernetes.io/projected/5147e527-afd1-402c-995b-814eebf64541-kube-api-access-djlv5\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl" Apr 22 18:49:02.220785 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220343 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/00ddfe65-9d38-4b62-a7fc-877af5eec212-ca-trust-extracted\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.220785 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/00ddfe65-9d38-4b62-a7fc-877af5eec212-image-registry-private-configuration\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " 
pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.220785 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220777 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5147e527-afd1-402c-995b-814eebf64541-data-volume\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl" Apr 22 18:49:02.220974 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.220800 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5147e527-afd1-402c-995b-814eebf64541-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl" Apr 22 18:49:02.221147 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.221126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/00ddfe65-9d38-4b62-a7fc-877af5eec212-ca-trust-extracted\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.221322 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.221216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/00ddfe65-9d38-4b62-a7fc-877af5eec212-registry-certificates\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.221460 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.221440 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/00ddfe65-9d38-4b62-a7fc-877af5eec212-trusted-ca\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.222908 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.222885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/00ddfe65-9d38-4b62-a7fc-877af5eec212-image-registry-private-configuration\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.223055 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.223033 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5147e527-afd1-402c-995b-814eebf64541-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl" Apr 22 18:49:02.223148 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.223131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/00ddfe65-9d38-4b62-a7fc-877af5eec212-registry-tls\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.223321 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.223305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/00ddfe65-9d38-4b62-a7fc-877af5eec212-installation-pull-secrets\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 
18:49:02.235234 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.235202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00ddfe65-9d38-4b62-a7fc-877af5eec212-bound-sa-token\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.236008 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.235981 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djlv5\" (UniqueName: \"kubernetes.io/projected/5147e527-afd1-402c-995b-814eebf64541-kube-api-access-djlv5\") pod \"insights-runtime-extractor-9gjnl\" (UID: \"5147e527-afd1-402c-995b-814eebf64541\") " pod="openshift-insights/insights-runtime-extractor-9gjnl" Apr 22 18:49:02.236176 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.236161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpd9v\" (UniqueName: \"kubernetes.io/projected/00ddfe65-9d38-4b62-a7fc-877af5eec212-kube-api-access-wpd9v\") pod \"image-registry-597d994cdc-cvzrg\" (UID: \"00ddfe65-9d38-4b62-a7fc-877af5eec212\") " pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.366543 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.366457 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:02.381592 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.381564 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9gjnl" Apr 22 18:49:02.493032 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.493000 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-597d994cdc-cvzrg"] Apr 22 18:49:02.496207 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:02.496180 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00ddfe65_9d38_4b62_a7fc_877af5eec212.slice/crio-54d1e1adb8e7a335d191826ae044538ccca9844e7d3fc8ed8ef076e99e179d05 WatchSource:0}: Error finding container 54d1e1adb8e7a335d191826ae044538ccca9844e7d3fc8ed8ef076e99e179d05: Status 404 returned error can't find the container with id 54d1e1adb8e7a335d191826ae044538ccca9844e7d3fc8ed8ef076e99e179d05 Apr 22 18:49:02.509024 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.509002 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9gjnl"] Apr 22 18:49:02.723860 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.723816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:49:02.726149 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.726129 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c33a8222-6663-4971-9e27-d05681becacf-metrics-certs\") pod \"network-metrics-daemon-4q2cb\" (UID: \"c33a8222-6663-4971-9e27-d05681becacf\") " pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:49:02.798959 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.798931 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9rzz8\"" Apr 22 18:49:02.806872 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.806850 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4q2cb" Apr 22 18:49:02.934409 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:02.934370 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4q2cb"] Apr 22 18:49:02.937940 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:02.937908 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc33a8222_6663_4971_9e27_d05681becacf.slice/crio-d7a680b89022870229d5468798aaa69cfe78bd5908b75f4ca34de04eef50588c WatchSource:0}: Error finding container d7a680b89022870229d5468798aaa69cfe78bd5908b75f4ca34de04eef50588c: Status 404 returned error can't find the container with id d7a680b89022870229d5468798aaa69cfe78bd5908b75f4ca34de04eef50588c Apr 22 18:49:03.375005 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:03.374918 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4q2cb" event={"ID":"c33a8222-6663-4971-9e27-d05681becacf","Type":"ContainerStarted","Data":"d7a680b89022870229d5468798aaa69cfe78bd5908b75f4ca34de04eef50588c"} Apr 22 18:49:03.376698 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:03.376666 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9gjnl" event={"ID":"5147e527-afd1-402c-995b-814eebf64541","Type":"ContainerStarted","Data":"57188666c8c25d154c37f2149f5c62bef99d8b55a303b849817fe591feccaaf5"} Apr 22 18:49:03.376836 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:03.376704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9gjnl" 
event={"ID":"5147e527-afd1-402c-995b-814eebf64541","Type":"ContainerStarted","Data":"27be649f2422ea89060ff660b39468500c333d06a84cb3333462b407f6bad1ec"} Apr 22 18:49:03.376836 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:03.376721 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9gjnl" event={"ID":"5147e527-afd1-402c-995b-814eebf64541","Type":"ContainerStarted","Data":"4b1c286116236ca9491c5ddbfe1d14622aada412d6424a9fac69d72c4964082a"} Apr 22 18:49:03.378083 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:03.378058 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" event={"ID":"00ddfe65-9d38-4b62-a7fc-877af5eec212","Type":"ContainerStarted","Data":"57b0a1d17d347926775f653a569bd5fbf290e577ac83f844fee274aa1d02b8e9"} Apr 22 18:49:03.378195 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:03.378087 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" event={"ID":"00ddfe65-9d38-4b62-a7fc-877af5eec212","Type":"ContainerStarted","Data":"54d1e1adb8e7a335d191826ae044538ccca9844e7d3fc8ed8ef076e99e179d05"} Apr 22 18:49:03.378262 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:03.378250 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:03.396842 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:03.396791 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" podStartSLOduration=1.396773635 podStartE2EDuration="1.396773635s" podCreationTimestamp="2026-04-22 18:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:03.395480388 +0000 UTC m=+130.920143940" watchObservedRunningTime="2026-04-22 18:49:03.396773635 
+0000 UTC m=+130.921437188" Apr 22 18:49:04.383024 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:04.382981 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4q2cb" event={"ID":"c33a8222-6663-4971-9e27-d05681becacf","Type":"ContainerStarted","Data":"4f934994acd9f445b594bc548fd3ef63f647ac3af69f4f325bd03ec2dfc60e06"} Apr 22 18:49:04.383024 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:04.383029 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4q2cb" event={"ID":"c33a8222-6663-4971-9e27-d05681becacf","Type":"ContainerStarted","Data":"5471258970d8033bb5aad950b835cf2663b4c8c345e4f38185b4e08083f6f9dd"} Apr 22 18:49:05.388073 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:05.388035 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9gjnl" event={"ID":"5147e527-afd1-402c-995b-814eebf64541","Type":"ContainerStarted","Data":"60b188e4107fc2442d1cee17eccf1ca2d3fd6714285900ed32d9658db5bc901e"} Apr 22 18:49:05.404504 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:05.404457 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4q2cb" podStartSLOduration=131.511177331 podStartE2EDuration="2m12.404442765s" podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:49:02.939935311 +0000 UTC m=+130.464598839" lastFinishedPulling="2026-04-22 18:49:03.833200744 +0000 UTC m=+131.357864273" observedRunningTime="2026-04-22 18:49:04.397668568 +0000 UTC m=+131.922332119" watchObservedRunningTime="2026-04-22 18:49:05.404442765 +0000 UTC m=+132.929106330" Apr 22 18:49:05.405014 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:05.404990 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9gjnl" podStartSLOduration=1.409151625 podStartE2EDuration="3.404984744s" 
podCreationTimestamp="2026-04-22 18:49:02 +0000 UTC" firstStartedPulling="2026-04-22 18:49:02.558666932 +0000 UTC m=+130.083330461" lastFinishedPulling="2026-04-22 18:49:04.554500047 +0000 UTC m=+132.079163580" observedRunningTime="2026-04-22 18:49:05.403147523 +0000 UTC m=+132.927811074" watchObservedRunningTime="2026-04-22 18:49:05.404984744 +0000 UTC m=+132.929648332" Apr 22 18:49:08.948522 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.948479 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-chvkl"] Apr 22 18:49:08.950465 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.950440 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" Apr 22 18:49:08.954749 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.954725 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 18:49:08.954749 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.954737 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 18:49:08.954928 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.954794 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-shhl8\"" Apr 22 18:49:08.954928 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.954802 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:49:08.954928 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.954845 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:49:08.962183 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.962163 2577 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-chvkl"] Apr 22 18:49:08.970672 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.970648 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8mkqk"] Apr 22 18:49:08.972355 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.972336 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/278d1366-e4d5-4510-97e7-454f852e755e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" Apr 22 18:49:08.972414 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.972367 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" Apr 22 18:49:08.972414 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.972389 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/278d1366-e4d5-4510-97e7-454f852e755e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" Apr 22 18:49:08.972510 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.972443 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" Apr 22 18:49:08.972510 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.972476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx6dp\" (UniqueName: \"kubernetes.io/projected/278d1366-e4d5-4510-97e7-454f852e755e-kube-api-access-wx6dp\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" Apr 22 18:49:08.972510 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.972497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" Apr 22 18:49:08.972702 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.972688 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8mkqk" Apr 22 18:49:08.975374 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.975346 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:49:08.975610 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.975592 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:49:08.975885 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.975861 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:49:08.975978 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:08.975865 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nltc8\"" Apr 22 18:49:09.073377 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqv5n\" (UniqueName: \"kubernetes.io/projected/21175012-ee46-4d06-8d03-22a7d3555566-kube-api-access-hqv5n\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk" Apr 22 18:49:09.073377 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-textfile\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk" Apr 22 18:49:09.073563 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-accelerators-collector-config\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.073563 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073461 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21175012-ee46-4d06-8d03-22a7d3555566-sys\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.073563 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/278d1366-e4d5-4510-97e7-454f852e755e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.073563 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/21175012-ee46-4d06-8d03-22a7d3555566-root\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.073563 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-wtmp\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.073750 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.073750 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/278d1366-e4d5-4510-97e7-454f852e755e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.073750 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.073750 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wx6dp\" (UniqueName: \"kubernetes.io/projected/278d1366-e4d5-4510-97e7-454f852e755e-kube-api-access-wx6dp\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.073750 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:49:09.073664 2577 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 22 18:49:09.073750 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21175012-ee46-4d06-8d03-22a7d3555566-metrics-client-ca\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.073750 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:49:09.073715 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-tls podName:278d1366-e4d5-4510-97e7-454f852e755e nodeName:}" failed. No retries permitted until 2026-04-22 18:49:09.573695985 +0000 UTC m=+137.098359515 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-chvkl" (UID: "278d1366-e4d5-4510-97e7-454f852e755e") : secret "kube-state-metrics-tls" not found
Apr 22 18:49:09.074083 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.074083 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-tls\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.074083 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.073882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.074083 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.074010 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/278d1366-e4d5-4510-97e7-454f852e755e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.074308 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.074197 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/278d1366-e4d5-4510-97e7-454f852e755e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.074425 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.074405 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.076028 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.076009 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.081705 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.081679 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx6dp\" (UniqueName: \"kubernetes.io/projected/278d1366-e4d5-4510-97e7-454f852e755e-kube-api-access-wx6dp\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.174972 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.174934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqv5n\" (UniqueName: \"kubernetes.io/projected/21175012-ee46-4d06-8d03-22a7d3555566-kube-api-access-hqv5n\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.174972 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.174970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-textfile\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175185 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.174997 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-accelerators-collector-config\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175185 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175032 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21175012-ee46-4d06-8d03-22a7d3555566-sys\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175185 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175090 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21175012-ee46-4d06-8d03-22a7d3555566-sys\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175185 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/21175012-ee46-4d06-8d03-22a7d3555566-root\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175185 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-wtmp\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175495 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175182 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/21175012-ee46-4d06-8d03-22a7d3555566-root\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175495 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21175012-ee46-4d06-8d03-22a7d3555566-metrics-client-ca\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175495 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-tls\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175495 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175495 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175337 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-wtmp\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175495 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175369 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-textfile\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175877 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-accelerators-collector-config\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.175877 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.175864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21175012-ee46-4d06-8d03-22a7d3555566-metrics-client-ca\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.177639 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.177614 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-tls\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.177722 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.177695 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21175012-ee46-4d06-8d03-22a7d3555566-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.182798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.182775 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqv5n\" (UniqueName: \"kubernetes.io/projected/21175012-ee46-4d06-8d03-22a7d3555566-kube-api-access-hqv5n\") pod \"node-exporter-8mkqk\" (UID: \"21175012-ee46-4d06-8d03-22a7d3555566\") " pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.281791 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.281761 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8mkqk"
Apr 22 18:49:09.289432 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:09.289403 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21175012_ee46_4d06_8d03_22a7d3555566.slice/crio-8a5e36ec04b0c8591b075477b33f93ca100df334c8ccbf311bfc60a30fd819d9 WatchSource:0}: Error finding container 8a5e36ec04b0c8591b075477b33f93ca100df334c8ccbf311bfc60a30fd819d9: Status 404 returned error can't find the container with id 8a5e36ec04b0c8591b075477b33f93ca100df334c8ccbf311bfc60a30fd819d9
Apr 22 18:49:09.402737 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.402699 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8mkqk" event={"ID":"21175012-ee46-4d06-8d03-22a7d3555566","Type":"ContainerStarted","Data":"8a5e36ec04b0c8591b075477b33f93ca100df334c8ccbf311bfc60a30fd819d9"}
Apr 22 18:49:09.578044 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.577949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.580833 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.580802 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/278d1366-e4d5-4510-97e7-454f852e755e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-chvkl\" (UID: \"278d1366-e4d5-4510-97e7-454f852e755e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:09.859358 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:09.859262 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl"
Apr 22 18:49:10.002910 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.002873 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-chvkl"]
Apr 22 18:49:10.009650 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.009622 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:49:10.012733 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.012713 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.015173 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.015114 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 18:49:10.015316 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.015255 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 18:49:10.015432 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.015403 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 18:49:10.015546 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.015441 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 18:49:10.015546 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.015501 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5tfxw\""
Apr 22 18:49:10.015671 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.015561 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 18:49:10.015671 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.015602 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 18:49:10.015803 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.015716 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 18:49:10.015803 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.015744 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 18:49:10.015803 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.015794 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 18:49:10.027249 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.027219 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:49:10.081188 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081389 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081389 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081389 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081306 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081389 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081330 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081389 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-config-volume\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081389 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081377 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdxvr\" (UniqueName: \"kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-kube-api-access-fdxvr\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081693 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-web-config\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081693 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081693 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081456 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081693 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081500 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081693 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081542 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-config-out\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.081693 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.081580 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.105526 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:10.105497 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod278d1366_e4d5_4510_97e7_454f852e755e.slice/crio-fa0b1589702462016a2559e92d44886a6347a3b444509133adb1ceef07498183 WatchSource:0}: Error finding container fa0b1589702462016a2559e92d44886a6347a3b444509133adb1ceef07498183: Status 404 returned error can't find the container with id fa0b1589702462016a2559e92d44886a6347a3b444509133adb1ceef07498183
Apr 22 18:49:10.182689 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.182658 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.182804 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.182707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.182804 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.182768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.182804 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.182796 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.182963 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.182821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.182963 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.182881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-config-volume\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.182963 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.182923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdxvr\" (UniqueName: \"kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-kube-api-access-fdxvr\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.182963 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.182951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-web-config\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.183162 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.182978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.183162 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.183016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.183162 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.183064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.183162 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.183105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-config-out\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.183162 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.183139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.183756 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.183456 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.185159 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.184341 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.185159 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:49:10.184732 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-trusted-ca-bundle podName:c03cef71-3c3e-493f-b282-90922361220a nodeName:}" failed. No retries permitted until 2026-04-22 18:49:10.684710017 +0000 UTC m=+138.209373553 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "c03cef71-3c3e-493f-b282-90922361220a") : configmap references non-existent config key: ca-bundle.crt
Apr 22 18:49:10.186634 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.186240 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.186745 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.186692 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.186991 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.186968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.187279 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.187246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-config-volume\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.187383 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.187323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.187617 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.187590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-config-out\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.187692 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.187662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.187869 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.187848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.187963 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.187950 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-web-config\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.190935 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.190911 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdxvr\" (UniqueName: \"kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-kube-api-access-fdxvr\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.406753 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.406663 2577 generic.go:358] "Generic (PLEG): container finished" podID="21175012-ee46-4d06-8d03-22a7d3555566" containerID="31ee92c586a605cc871f91fd6f9ca2cbedcdf9713a10c1b611d29dc2373feeb3" exitCode=0
Apr 22 18:49:10.406906 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.406749 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8mkqk" event={"ID":"21175012-ee46-4d06-8d03-22a7d3555566","Type":"ContainerDied","Data":"31ee92c586a605cc871f91fd6f9ca2cbedcdf9713a10c1b611d29dc2373feeb3"}
Apr 22 18:49:10.407910 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.407887 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" event={"ID":"278d1366-e4d5-4510-97e7-454f852e755e","Type":"ContainerStarted","Data":"fa0b1589702462016a2559e92d44886a6347a3b444509133adb1ceef07498183"}
Apr 22 18:49:10.688372 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.688281 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.688996 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.688974 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:10.923322 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:10.923286 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:11.068844 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:11.068819 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:49:11.336928 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:11.336848 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc03cef71_3c3e_493f_b282_90922361220a.slice/crio-2ce2b3afab10c1e2be3c9ed68edfdba7027cf1cd267b7b4407cf127ed616393f WatchSource:0}: Error finding container 2ce2b3afab10c1e2be3c9ed68edfdba7027cf1cd267b7b4407cf127ed616393f: Status 404 returned error can't find the container with id 2ce2b3afab10c1e2be3c9ed68edfdba7027cf1cd267b7b4407cf127ed616393f
Apr 22 18:49:11.413159 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:11.413131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8mkqk" event={"ID":"21175012-ee46-4d06-8d03-22a7d3555566","Type":"ContainerStarted","Data":"e4ce98ef2951133028eb655abbaa6ed579ec91348828f922e655d8dcec6056e3"}
Apr 22 18:49:11.413322
ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:11.413173 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8mkqk" event={"ID":"21175012-ee46-4d06-8d03-22a7d3555566","Type":"ContainerStarted","Data":"6634ae7b1d9763a246d37a072a2122e340615aa35f16d590943ba220681a7f75"} Apr 22 18:49:11.414717 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:11.414618 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" event={"ID":"278d1366-e4d5-4510-97e7-454f852e755e","Type":"ContainerStarted","Data":"69477723c1ec7b7cd5e30800c88525e4008982482fd669256c8a3493f9801213"} Apr 22 18:49:11.415690 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:11.415668 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerStarted","Data":"2ce2b3afab10c1e2be3c9ed68edfdba7027cf1cd267b7b4407cf127ed616393f"} Apr 22 18:49:11.430122 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:11.430081 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8mkqk" podStartSLOduration=2.594023263 podStartE2EDuration="3.430066755s" podCreationTimestamp="2026-04-22 18:49:08 +0000 UTC" firstStartedPulling="2026-04-22 18:49:09.291176591 +0000 UTC m=+136.815840120" lastFinishedPulling="2026-04-22 18:49:10.127220084 +0000 UTC m=+137.651883612" observedRunningTime="2026-04-22 18:49:11.428674252 +0000 UTC m=+138.953337804" watchObservedRunningTime="2026-04-22 18:49:11.430066755 +0000 UTC m=+138.954730308" Apr 22 18:49:12.421177 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:12.421144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" event={"ID":"278d1366-e4d5-4510-97e7-454f852e755e","Type":"ContainerStarted","Data":"cf7021e9051edc4e7a33a0ed659ed5088261b8ef0b4b1c5c91212ca684c7de34"} Apr 22 
18:49:12.421677 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:12.421189 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" event={"ID":"278d1366-e4d5-4510-97e7-454f852e755e","Type":"ContainerStarted","Data":"c5b859c94bbd79619fb1da3837f591828a888d8302cf0cbbb031e0f24891fd42"} Apr 22 18:49:12.439846 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:12.439788 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-chvkl" podStartSLOduration=3.190939091 podStartE2EDuration="4.439767464s" podCreationTimestamp="2026-04-22 18:49:08 +0000 UTC" firstStartedPulling="2026-04-22 18:49:10.107961744 +0000 UTC m=+137.632625273" lastFinishedPulling="2026-04-22 18:49:11.356790114 +0000 UTC m=+138.881453646" observedRunningTime="2026-04-22 18:49:12.437502761 +0000 UTC m=+139.962166311" watchObservedRunningTime="2026-04-22 18:49:12.439767464 +0000 UTC m=+139.964431016" Apr 22 18:49:13.253984 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.253948 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5bc5566b49-nwt7t"] Apr 22 18:49:13.255954 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.255939 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.258420 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.258388 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dfsfpev6ddpd8\"" Apr 22 18:49:13.258420 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.258402 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 18:49:13.258701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.258387 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 18:49:13.258701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.258522 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 18:49:13.258701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.258619 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-4mmnv\"" Apr 22 18:49:13.259471 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.259454 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:49:13.264656 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.264634 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5bc5566b49-nwt7t"] Apr 22 18:49:13.311400 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.311367 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5443706c-f0f0-4036-8c01-0faa5b4b7f57-secret-metrics-server-tls\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " 
pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.311400 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.311400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5443706c-f0f0-4036-8c01-0faa5b4b7f57-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.311606 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.311419 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5443706c-f0f0-4036-8c01-0faa5b4b7f57-client-ca-bundle\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.311606 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.311506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5443706c-f0f0-4036-8c01-0faa5b4b7f57-secret-metrics-server-client-certs\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.311606 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.311546 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sthl\" (UniqueName: \"kubernetes.io/projected/5443706c-f0f0-4036-8c01-0faa5b4b7f57-kube-api-access-2sthl\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.311606 ip-10-0-143-56 kubenswrapper[2577]: I0422 
18:49:13.311604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5443706c-f0f0-4036-8c01-0faa5b4b7f57-metrics-server-audit-profiles\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.311748 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.311642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5443706c-f0f0-4036-8c01-0faa5b4b7f57-audit-log\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.412007 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.411974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sthl\" (UniqueName: \"kubernetes.io/projected/5443706c-f0f0-4036-8c01-0faa5b4b7f57-kube-api-access-2sthl\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.412157 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.412025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5443706c-f0f0-4036-8c01-0faa5b4b7f57-metrics-server-audit-profiles\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.412157 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.412066 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5443706c-f0f0-4036-8c01-0faa5b4b7f57-audit-log\") 
pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.412157 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.412094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5443706c-f0f0-4036-8c01-0faa5b4b7f57-secret-metrics-server-tls\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.412157 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.412112 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5443706c-f0f0-4036-8c01-0faa5b4b7f57-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.412157 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.412128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5443706c-f0f0-4036-8c01-0faa5b4b7f57-client-ca-bundle\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.412419 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.412215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5443706c-f0f0-4036-8c01-0faa5b4b7f57-secret-metrics-server-client-certs\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.412593 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:49:13.412573 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5443706c-f0f0-4036-8c01-0faa5b4b7f57-audit-log\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.412884 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.412863 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5443706c-f0f0-4036-8c01-0faa5b4b7f57-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.413148 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.413124 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5443706c-f0f0-4036-8c01-0faa5b4b7f57-metrics-server-audit-profiles\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.414682 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.414652 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5443706c-f0f0-4036-8c01-0faa5b4b7f57-secret-metrics-server-tls\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.414804 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.414782 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5443706c-f0f0-4036-8c01-0faa5b4b7f57-client-ca-bundle\") pod 
\"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.414863 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.414789 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5443706c-f0f0-4036-8c01-0faa5b4b7f57-secret-metrics-server-client-certs\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.420778 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.420756 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sthl\" (UniqueName: \"kubernetes.io/projected/5443706c-f0f0-4036-8c01-0faa5b4b7f57-kube-api-access-2sthl\") pod \"metrics-server-5bc5566b49-nwt7t\" (UID: \"5443706c-f0f0-4036-8c01-0faa5b4b7f57\") " pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.427364 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.427341 2577 generic.go:358] "Generic (PLEG): container finished" podID="c03cef71-3c3e-493f-b282-90922361220a" containerID="b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393" exitCode=0 Apr 22 18:49:13.427633 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.427425 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerDied","Data":"b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393"} Apr 22 18:49:13.565748 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.565669 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:13.687798 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:13.687697 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5bc5566b49-nwt7t"] Apr 22 18:49:13.690445 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:13.690418 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5443706c_f0f0_4036_8c01_0faa5b4b7f57.slice/crio-6c73d833a84667e976e7e7832052da5418b7a3f79e6b3faea3e12a9b9fb396a7 WatchSource:0}: Error finding container 6c73d833a84667e976e7e7832052da5418b7a3f79e6b3faea3e12a9b9fb396a7: Status 404 returned error can't find the container with id 6c73d833a84667e976e7e7832052da5418b7a3f79e6b3faea3e12a9b9fb396a7 Apr 22 18:49:14.226957 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.226916 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr"] Apr 22 18:49:14.229578 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.229546 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.232171 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.232128 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 18:49:14.232171 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.232135 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 18:49:14.232352 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.232138 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 18:49:14.232352 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.232223 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-7xbj9\"" Apr 22 18:49:14.232446 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.232353 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 18:49:14.233458 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.233436 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 18:49:14.238342 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.238188 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 18:49:14.241453 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.241432 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr"] Apr 22 18:49:14.326562 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.326521 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-federate-client-tls\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.326752 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.326599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-telemeter-client-tls\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.326752 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.326647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.326752 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.326684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz56v\" (UniqueName: \"kubernetes.io/projected/f2809948-8435-496d-ae4b-0791482f79c7-kube-api-access-dz56v\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.326752 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.326747 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f2809948-8435-496d-ae4b-0791482f79c7-serving-certs-ca-bundle\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.326959 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.326762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-secret-telemeter-client\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.326959 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.326784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2809948-8435-496d-ae4b-0791482f79c7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.326959 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.326806 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2809948-8435-496d-ae4b-0791482f79c7-metrics-client-ca\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.427689 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.427640 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-secret-telemeter-client-kube-rbac-proxy-config\") pod 
\"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.428130 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.427701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dz56v\" (UniqueName: \"kubernetes.io/projected/f2809948-8435-496d-ae4b-0791482f79c7-kube-api-access-dz56v\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.428130 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.427757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2809948-8435-496d-ae4b-0791482f79c7-serving-certs-ca-bundle\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.428130 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.427783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-secret-telemeter-client\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.428130 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.427811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2809948-8435-496d-ae4b-0791482f79c7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.428130 ip-10-0-143-56 kubenswrapper[2577]: I0422 
18:49:14.427834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2809948-8435-496d-ae4b-0791482f79c7-metrics-client-ca\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.428130 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.427934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-federate-client-tls\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.428130 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.428020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-telemeter-client-tls\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.428538 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.428515 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2809948-8435-496d-ae4b-0791482f79c7-serving-certs-ca-bundle\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.429227 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.429181 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2809948-8435-496d-ae4b-0791482f79c7-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.429396 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.429370 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2809948-8435-496d-ae4b-0791482f79c7-metrics-client-ca\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.430735 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.430708 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.431466 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.431441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-secret-telemeter-client\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.431796 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.431770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-telemeter-client-tls\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.432673 ip-10-0-143-56 kubenswrapper[2577]: I0422 
18:49:14.432643 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" event={"ID":"5443706c-f0f0-4036-8c01-0faa5b4b7f57","Type":"ContainerStarted","Data":"6c73d833a84667e976e7e7832052da5418b7a3f79e6b3faea3e12a9b9fb396a7"} Apr 22 18:49:14.433189 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.433167 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f2809948-8435-496d-ae4b-0791482f79c7-federate-client-tls\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.436088 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.436067 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz56v\" (UniqueName: \"kubernetes.io/projected/f2809948-8435-496d-ae4b-0791482f79c7-kube-api-access-dz56v\") pod \"telemeter-client-6d58b4f4bd-fgwqr\" (UID: \"f2809948-8435-496d-ae4b-0791482f79c7\") " pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.542704 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.542620 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" Apr 22 18:49:14.696107 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:14.695988 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr"] Apr 22 18:49:14.700646 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:14.700616 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2809948_8435_496d_ae4b_0791482f79c7.slice/crio-bd59461d9aca86c5c814b11d3b8747795b65b3e865a2db5bf6d07fbb947f78c2 WatchSource:0}: Error finding container bd59461d9aca86c5c814b11d3b8747795b65b3e865a2db5bf6d07fbb947f78c2: Status 404 returned error can't find the container with id bd59461d9aca86c5c814b11d3b8747795b65b3e865a2db5bf6d07fbb947f78c2 Apr 22 18:49:15.436931 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:15.436841 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" event={"ID":"f2809948-8435-496d-ae4b-0791482f79c7","Type":"ContainerStarted","Data":"bd59461d9aca86c5c814b11d3b8747795b65b3e865a2db5bf6d07fbb947f78c2"} Apr 22 18:49:15.440128 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:15.440095 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerStarted","Data":"a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434"} Apr 22 18:49:15.440279 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:15.440131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerStarted","Data":"11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142"} Apr 22 18:49:15.440279 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:15.440147 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerStarted","Data":"0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7"} Apr 22 18:49:15.440279 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:15.440161 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerStarted","Data":"fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7"} Apr 22 18:49:15.440279 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:15.440175 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerStarted","Data":"5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011"} Apr 22 18:49:15.441790 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:15.441768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" event={"ID":"5443706c-f0f0-4036-8c01-0faa5b4b7f57","Type":"ContainerStarted","Data":"d6758c665644f7147e3b7c7781d85e013ab3229436070b6c9f521b3b493b754d"} Apr 22 18:49:15.458092 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:15.458033 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" podStartSLOduration=0.966887454 podStartE2EDuration="2.458017702s" podCreationTimestamp="2026-04-22 18:49:13 +0000 UTC" firstStartedPulling="2026-04-22 18:49:13.692739374 +0000 UTC m=+141.217402904" lastFinishedPulling="2026-04-22 18:49:15.183869623 +0000 UTC m=+142.708533152" observedRunningTime="2026-04-22 18:49:15.45658678 +0000 UTC m=+142.981250332" watchObservedRunningTime="2026-04-22 18:49:15.458017702 +0000 UTC m=+142.982681257" Apr 22 18:49:15.986636 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:15.986602 2577 scope.go:117] "RemoveContainer" 
containerID="01fc3b860886a61fe9e6cfce2f6c9cd3069a5f4a8d3a4a22ab404f174d249e42" Apr 22 18:49:16.446583 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:16.446553 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 18:49:16.447016 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:16.446682 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" event={"ID":"407ba526-67b3-4fe5-9bc6-2c9894fb034f","Type":"ContainerStarted","Data":"49071295716d73dffc08d3e9fb50de536277c7263e1566a43a1a127caef96049"} Apr 22 18:49:16.447346 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:16.447327 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:49:16.450326 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:16.450299 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerStarted","Data":"b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db"} Apr 22 18:49:16.464153 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:16.464104 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" podStartSLOduration=43.151477688 podStartE2EDuration="45.46408792s" podCreationTimestamp="2026-04-22 18:48:31 +0000 UTC" firstStartedPulling="2026-04-22 18:48:31.93198923 +0000 UTC m=+99.456652759" lastFinishedPulling="2026-04-22 18:48:34.244599447 +0000 UTC m=+101.769262991" observedRunningTime="2026-04-22 18:49:16.463007274 +0000 UTC m=+143.987670850" watchObservedRunningTime="2026-04-22 18:49:16.46408792 +0000 UTC m=+143.988751470" Apr 22 18:49:16.486585 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:16.486506 
2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.9490257829999997 podStartE2EDuration="7.486489172s" podCreationTimestamp="2026-04-22 18:49:09 +0000 UTC" firstStartedPulling="2026-04-22 18:49:11.33894334 +0000 UTC m=+138.863606886" lastFinishedPulling="2026-04-22 18:49:15.876406746 +0000 UTC m=+143.401070275" observedRunningTime="2026-04-22 18:49:16.485407285 +0000 UTC m=+144.010070837" watchObservedRunningTime="2026-04-22 18:49:16.486489172 +0000 UTC m=+144.011152726" Apr 22 18:49:17.448044 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.447998 2577 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-l4kjm container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.134.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 22 18:49:17.448457 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.448067 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" podUID="407ba526-67b3-4fe5-9bc6-2c9894fb034f" containerName="console-operator" probeResult="failure" output="Get \"https://10.134.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 22 18:49:17.455133 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.455098 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" event={"ID":"f2809948-8435-496d-ae4b-0791482f79c7","Type":"ContainerStarted","Data":"f5f675c186cc4e5674f2b10ccd97204a9a7e02fbfdf12cfa64139387f158724b"} Apr 22 18:49:17.455133 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.455137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" 
event={"ID":"f2809948-8435-496d-ae4b-0791482f79c7","Type":"ContainerStarted","Data":"58227dc67a7879137885ccd8b1b225720f8dbfb0e6073fb3c6f5612de0ffa1ca"} Apr 22 18:49:17.455354 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.455147 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" event={"ID":"f2809948-8435-496d-ae4b-0791482f79c7","Type":"ContainerStarted","Data":"3dfab193a4f6c41faf1fb7d2c055b5b4b425b56eef8f0d658083cf70f294f21d"} Apr 22 18:49:17.475681 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.475625 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6d58b4f4bd-fgwqr" podStartSLOduration=1.580691375 podStartE2EDuration="3.47561158s" podCreationTimestamp="2026-04-22 18:49:14 +0000 UTC" firstStartedPulling="2026-04-22 18:49:14.702891226 +0000 UTC m=+142.227554770" lastFinishedPulling="2026-04-22 18:49:16.597811432 +0000 UTC m=+144.122474975" observedRunningTime="2026-04-22 18:49:17.4754058 +0000 UTC m=+145.000069367" watchObservedRunningTime="2026-04-22 18:49:17.47561158 +0000 UTC m=+145.000275132" Apr 22 18:49:17.775902 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.775872 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-l4kjm" Apr 22 18:49:17.958311 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.958262 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-kj98m"] Apr 22 18:49:17.960316 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.960263 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kj98m" Apr 22 18:49:17.962900 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.962876 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:49:17.963016 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.962912 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-bnn4s\"" Apr 22 18:49:17.963016 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.963004 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:49:17.969227 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:17.969206 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kj98m"] Apr 22 18:49:18.061711 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:18.061619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9bqj\" (UniqueName: \"kubernetes.io/projected/863935b8-e8a7-4f32-aa10-ab772cc335b8-kube-api-access-w9bqj\") pod \"downloads-6bcc868b7-kj98m\" (UID: \"863935b8-e8a7-4f32-aa10-ab772cc335b8\") " pod="openshift-console/downloads-6bcc868b7-kj98m" Apr 22 18:49:18.162805 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:18.162768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9bqj\" (UniqueName: \"kubernetes.io/projected/863935b8-e8a7-4f32-aa10-ab772cc335b8-kube-api-access-w9bqj\") pod \"downloads-6bcc868b7-kj98m\" (UID: \"863935b8-e8a7-4f32-aa10-ab772cc335b8\") " pod="openshift-console/downloads-6bcc868b7-kj98m" Apr 22 18:49:18.171068 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:18.171040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9bqj\" (UniqueName: 
\"kubernetes.io/projected/863935b8-e8a7-4f32-aa10-ab772cc335b8-kube-api-access-w9bqj\") pod \"downloads-6bcc868b7-kj98m\" (UID: \"863935b8-e8a7-4f32-aa10-ab772cc335b8\") " pod="openshift-console/downloads-6bcc868b7-kj98m" Apr 22 18:49:18.270130 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:18.270095 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kj98m" Apr 22 18:49:18.395037 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:18.394972 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kj98m"] Apr 22 18:49:18.399344 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:18.399309 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod863935b8_e8a7_4f32_aa10_ab772cc335b8.slice/crio-edced3ab0478fc96856227af76632fa683bdaf767690015722e5eab3392b01d7 WatchSource:0}: Error finding container edced3ab0478fc96856227af76632fa683bdaf767690015722e5eab3392b01d7: Status 404 returned error can't find the container with id edced3ab0478fc96856227af76632fa683bdaf767690015722e5eab3392b01d7 Apr 22 18:49:18.458332 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:18.458297 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kj98m" event={"ID":"863935b8-e8a7-4f32-aa10-ab772cc335b8","Type":"ContainerStarted","Data":"edced3ab0478fc96856227af76632fa683bdaf767690015722e5eab3392b01d7"} Apr 22 18:49:22.371281 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:22.371231 2577 patch_prober.go:28] interesting pod/image-registry-597d994cdc-cvzrg container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:49:22.371691 ip-10-0-143-56 kubenswrapper[2577]: 
I0422 18:49:22.371302 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" podUID="00ddfe65-9d38-4b62-a7fc-877af5eec212" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:49:24.387574 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:24.387543 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-597d994cdc-cvzrg" Apr 22 18:49:27.774334 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.774297 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9c6b7476c-7tzvd"] Apr 22 18:49:27.779317 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.779294 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.781919 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.781890 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 18:49:27.783218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.783070 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 18:49:27.783218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.783082 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 18:49:27.783218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.783102 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 18:49:27.783218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.783108 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 18:49:27.783218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.783202 2577 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-btbdp\"" Apr 22 18:49:27.788591 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.788567 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9c6b7476c-7tzvd"] Apr 22 18:49:27.857436 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.857397 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtsdb\" (UniqueName: \"kubernetes.io/projected/da3c4d1e-0025-4abf-91de-a8c230d0eb58-kube-api-access-rtsdb\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.857639 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.857453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-serving-cert\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.857639 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.857580 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-service-ca\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.857639 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.857615 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-config\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " 
pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.857788 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.857671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-oauth-config\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.857788 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.857706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-oauth-serving-cert\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.958481 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.958445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-service-ca\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.958678 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.958494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-config\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.958678 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.958551 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-oauth-config\") 
pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.958678 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.958585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-oauth-serving-cert\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.958678 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.958664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtsdb\" (UniqueName: \"kubernetes.io/projected/da3c4d1e-0025-4abf-91de-a8c230d0eb58-kube-api-access-rtsdb\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.958896 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.958697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-serving-cert\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.959260 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.959229 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-service-ca\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.959410 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.959239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-config\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.959410 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.959358 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-oauth-serving-cert\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.961358 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.961333 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-oauth-config\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.961546 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.961524 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-serving-cert\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:27.966551 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:27.966532 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtsdb\" (UniqueName: \"kubernetes.io/projected/da3c4d1e-0025-4abf-91de-a8c230d0eb58-kube-api-access-rtsdb\") pod \"console-9c6b7476c-7tzvd\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:28.090582 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:28.090499 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:29.371113 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:49:29.371062 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qnb4b" podUID="be593d71-f465-4468-8034-246bf4c51e73" Apr 22 18:49:29.393316 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:49:29.393256 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-68s2k" podUID="130369e7-d304-4500-9ad6-18b8f2f4e731" Apr 22 18:49:29.493534 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:29.493494 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qnb4b" Apr 22 18:49:29.493705 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:29.493680 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-68s2k" Apr 22 18:49:33.566592 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:33.566551 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:33.567055 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:33.566608 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:33.852732 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:33.852710 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9c6b7476c-7tzvd"] Apr 22 18:49:33.862745 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:33.862717 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3c4d1e_0025_4abf_91de_a8c230d0eb58.slice/crio-cee72413cab6c4d007ff0443b64e1a9dd822ce4cd675a3970a0bf396981e2702 WatchSource:0}: Error finding container cee72413cab6c4d007ff0443b64e1a9dd822ce4cd675a3970a0bf396981e2702: Status 404 returned error can't find the container with id cee72413cab6c4d007ff0443b64e1a9dd822ce4cd675a3970a0bf396981e2702 Apr 22 18:49:34.321766 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.321735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b" Apr 22 18:49:34.321968 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.321840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " 
pod="openshift-dns/dns-default-68s2k" Apr 22 18:49:34.324552 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.324523 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/130369e7-d304-4500-9ad6-18b8f2f4e731-metrics-tls\") pod \"dns-default-68s2k\" (UID: \"130369e7-d304-4500-9ad6-18b8f2f4e731\") " pod="openshift-dns/dns-default-68s2k" Apr 22 18:49:34.324733 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.324705 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be593d71-f465-4468-8034-246bf4c51e73-cert\") pod \"ingress-canary-qnb4b\" (UID: \"be593d71-f465-4468-8034-246bf4c51e73\") " pod="openshift-ingress-canary/ingress-canary-qnb4b" Apr 22 18:49:34.511338 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.511215 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kj98m" event={"ID":"863935b8-e8a7-4f32-aa10-ab772cc335b8","Type":"ContainerStarted","Data":"81989e25b904d27b5141da3cc29524cb67460fe79f210a71d08f5499aec93a8e"} Apr 22 18:49:34.511338 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.511305 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-kj98m" Apr 22 18:49:34.512732 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.512696 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c6b7476c-7tzvd" event={"ID":"da3c4d1e-0025-4abf-91de-a8c230d0eb58","Type":"ContainerStarted","Data":"cee72413cab6c4d007ff0443b64e1a9dd822ce4cd675a3970a0bf396981e2702"} Apr 22 18:49:34.527593 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.527561 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-kj98m" Apr 22 18:49:34.531748 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.531653 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-kj98m" podStartSLOduration=2.107157028 podStartE2EDuration="17.531635485s" podCreationTimestamp="2026-04-22 18:49:17 +0000 UTC" firstStartedPulling="2026-04-22 18:49:18.401454207 +0000 UTC m=+145.926117739" lastFinishedPulling="2026-04-22 18:49:33.825932665 +0000 UTC m=+161.350596196" observedRunningTime="2026-04-22 18:49:34.529250957 +0000 UTC m=+162.053914510" watchObservedRunningTime="2026-04-22 18:49:34.531635485 +0000 UTC m=+162.056299038" Apr 22 18:49:34.596940 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.596844 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdfh7\"" Apr 22 18:49:34.597531 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.597132 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctxn4\"" Apr 22 18:49:34.604567 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.604534 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qnb4b" Apr 22 18:49:34.605036 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.605017 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-68s2k" Apr 22 18:49:34.772337 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.772228 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qnb4b"] Apr 22 18:49:34.777192 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:34.777147 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe593d71_f465_4468_8034_246bf4c51e73.slice/crio-3d4f938ca263d8d4ec962b4ae29983034bf8cadc3c8634440e80bb6bd7affb44 WatchSource:0}: Error finding container 3d4f938ca263d8d4ec962b4ae29983034bf8cadc3c8634440e80bb6bd7affb44: Status 404 returned error can't find the container with id 3d4f938ca263d8d4ec962b4ae29983034bf8cadc3c8634440e80bb6bd7affb44 Apr 22 18:49:34.800176 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:34.800126 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-68s2k"] Apr 22 18:49:34.803773 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:34.803740 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod130369e7_d304_4500_9ad6_18b8f2f4e731.slice/crio-2b2eb73262b2081c4c194c80f7ca0349c7cdc28f8999602da3247cca362a4884 WatchSource:0}: Error finding container 2b2eb73262b2081c4c194c80f7ca0349c7cdc28f8999602da3247cca362a4884: Status 404 returned error can't find the container with id 2b2eb73262b2081c4c194c80f7ca0349c7cdc28f8999602da3247cca362a4884 Apr 22 18:49:35.520651 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:35.520592 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-68s2k" event={"ID":"130369e7-d304-4500-9ad6-18b8f2f4e731","Type":"ContainerStarted","Data":"2b2eb73262b2081c4c194c80f7ca0349c7cdc28f8999602da3247cca362a4884"} Apr 22 18:49:35.523248 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:35.523199 2577 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-ingress-canary/ingress-canary-qnb4b" event={"ID":"be593d71-f465-4468-8034-246bf4c51e73","Type":"ContainerStarted","Data":"3d4f938ca263d8d4ec962b4ae29983034bf8cadc3c8634440e80bb6bd7affb44"} Apr 22 18:49:36.456471 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.456429 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-655d7c87f4-6lndf"] Apr 22 18:49:36.469140 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.469095 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.477029 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.476977 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-655d7c87f4-6lndf"] Apr 22 18:49:36.481612 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.480457 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 18:49:36.648195 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.648142 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-serving-cert\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.648367 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.648231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-oauth-serving-cert\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.648367 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.648288 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-config\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.648367 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.648325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-trusted-ca-bundle\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.648367 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.648359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-oauth-config\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.648593 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.648375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft7d6\" (UniqueName: \"kubernetes.io/projected/c2b57e2f-312a-4e24-92d6-8210bfb5d014-kube-api-access-ft7d6\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.648593 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.648420 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-service-ca\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " 
pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.749595 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.749470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-serving-cert\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.749595 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.749512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-oauth-serving-cert\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.749823 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.749643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-config\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.749823 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.749688 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-trusted-ca-bundle\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.750040 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.749953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-oauth-config\") pod 
\"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.750040 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.749996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ft7d6\" (UniqueName: \"kubernetes.io/projected/c2b57e2f-312a-4e24-92d6-8210bfb5d014-kube-api-access-ft7d6\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.750188 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.750049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-service-ca\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.750556 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.750535 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-oauth-serving-cert\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.750664 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.750583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-config\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.750916 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.750888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-trusted-ca-bundle\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.751155 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.750942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-service-ca\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.752887 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.752863 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-serving-cert\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.753313 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.753264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-oauth-config\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.758060 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.758035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft7d6\" (UniqueName: \"kubernetes.io/projected/c2b57e2f-312a-4e24-92d6-8210bfb5d014-kube-api-access-ft7d6\") pod \"console-655d7c87f4-6lndf\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:36.786252 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:36.786212 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:39.300250 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:39.300218 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-655d7c87f4-6lndf"] Apr 22 18:49:39.301877 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:49:39.301845 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b57e2f_312a_4e24_92d6_8210bfb5d014.slice/crio-59a8a980e4dd986fab25f12e4c7788e6f33256804e436e9353a226d26b955f32 WatchSource:0}: Error finding container 59a8a980e4dd986fab25f12e4c7788e6f33256804e436e9353a226d26b955f32: Status 404 returned error can't find the container with id 59a8a980e4dd986fab25f12e4c7788e6f33256804e436e9353a226d26b955f32 Apr 22 18:49:39.537863 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:39.537825 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c6b7476c-7tzvd" event={"ID":"da3c4d1e-0025-4abf-91de-a8c230d0eb58","Type":"ContainerStarted","Data":"7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69"} Apr 22 18:49:39.539676 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:39.539623 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-68s2k" event={"ID":"130369e7-d304-4500-9ad6-18b8f2f4e731","Type":"ContainerStarted","Data":"d18e35b19de9201c709677d1fe65fd06ff4ba30640d5d75bf407402870c3d95a"} Apr 22 18:49:39.541700 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:39.541295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-655d7c87f4-6lndf" event={"ID":"c2b57e2f-312a-4e24-92d6-8210bfb5d014","Type":"ContainerStarted","Data":"af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be"} Apr 22 18:49:39.541700 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:39.541327 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-655d7c87f4-6lndf" event={"ID":"c2b57e2f-312a-4e24-92d6-8210bfb5d014","Type":"ContainerStarted","Data":"59a8a980e4dd986fab25f12e4c7788e6f33256804e436e9353a226d26b955f32"} Apr 22 18:49:39.543950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:39.543913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qnb4b" event={"ID":"be593d71-f465-4468-8034-246bf4c51e73","Type":"ContainerStarted","Data":"951f0e3e121d74c2ceef0922bbd3cbab3ec7be3cd6876eed0fca023489ca95f4"} Apr 22 18:49:39.561247 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:39.560556 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9c6b7476c-7tzvd" podStartSLOduration=7.275653913 podStartE2EDuration="12.560540401s" podCreationTimestamp="2026-04-22 18:49:27 +0000 UTC" firstStartedPulling="2026-04-22 18:49:33.864789343 +0000 UTC m=+161.389452876" lastFinishedPulling="2026-04-22 18:49:39.149675832 +0000 UTC m=+166.674339364" observedRunningTime="2026-04-22 18:49:39.55856338 +0000 UTC m=+167.083226944" watchObservedRunningTime="2026-04-22 18:49:39.560540401 +0000 UTC m=+167.085203951" Apr 22 18:49:39.579257 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:39.579211 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-655d7c87f4-6lndf" podStartSLOduration=3.579196048 podStartE2EDuration="3.579196048s" podCreationTimestamp="2026-04-22 18:49:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:39.576716244 +0000 UTC m=+167.101379796" watchObservedRunningTime="2026-04-22 18:49:39.579196048 +0000 UTC m=+167.103859599" Apr 22 18:49:39.594103 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:39.594046 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qnb4b" 
podStartSLOduration=129.227889337 podStartE2EDuration="2m13.594015339s" podCreationTimestamp="2026-04-22 18:47:26 +0000 UTC" firstStartedPulling="2026-04-22 18:49:34.780348354 +0000 UTC m=+162.305011891" lastFinishedPulling="2026-04-22 18:49:39.14647435 +0000 UTC m=+166.671137893" observedRunningTime="2026-04-22 18:49:39.593197881 +0000 UTC m=+167.117861446" watchObservedRunningTime="2026-04-22 18:49:39.594015339 +0000 UTC m=+167.118678891" Apr 22 18:49:40.548962 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:40.548923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-68s2k" event={"ID":"130369e7-d304-4500-9ad6-18b8f2f4e731","Type":"ContainerStarted","Data":"a2e52c875448a8fa275dce24605e9c88b28d5516e10c054300735d15a34d440f"} Apr 22 18:49:40.549510 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:40.549494 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-68s2k" Apr 22 18:49:40.568960 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:40.568908 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-68s2k" podStartSLOduration=130.227949646 podStartE2EDuration="2m14.568894486s" podCreationTimestamp="2026-04-22 18:47:26 +0000 UTC" firstStartedPulling="2026-04-22 18:49:34.806602266 +0000 UTC m=+162.331265794" lastFinishedPulling="2026-04-22 18:49:39.147547102 +0000 UTC m=+166.672210634" observedRunningTime="2026-04-22 18:49:40.567239687 +0000 UTC m=+168.091903240" watchObservedRunningTime="2026-04-22 18:49:40.568894486 +0000 UTC m=+168.093558037" Apr 22 18:49:46.786502 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:46.786469 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:46.786957 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:46.786531 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:46.791171 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:46.791146 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:47.575057 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:47.575030 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:49:47.624092 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:47.624038 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9c6b7476c-7tzvd"] Apr 22 18:49:48.091632 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:48.091603 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:49:51.558196 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:51.558161 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-68s2k" Apr 22 18:49:53.571993 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:53.571961 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:49:53.575922 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:49:53.575900 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5bc5566b49-nwt7t" Apr 22 18:50:01.622413 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:01.622378 2577 generic.go:358] "Generic (PLEG): container finished" podID="20a2b785-7a65-4033-ae4d-0275a248aec8" containerID="eaa6accaa14ae9b7ec2db273cfb303fae3ccc92271090875d27453ea01e63cd3" exitCode=0 Apr 22 18:50:01.622803 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:01.622443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" 
event={"ID":"20a2b785-7a65-4033-ae4d-0275a248aec8","Type":"ContainerDied","Data":"eaa6accaa14ae9b7ec2db273cfb303fae3ccc92271090875d27453ea01e63cd3"} Apr 22 18:50:01.622803 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:01.622748 2577 scope.go:117] "RemoveContainer" containerID="eaa6accaa14ae9b7ec2db273cfb303fae3ccc92271090875d27453ea01e63cd3" Apr 22 18:50:02.627681 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:02.627645 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gkbcc" event={"ID":"20a2b785-7a65-4033-ae4d-0275a248aec8","Type":"ContainerStarted","Data":"5583c5ab8080a9da306ac5315155a11f00b10af630872ee640542b01427f926b"} Apr 22 18:50:12.644171 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:12.644123 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9c6b7476c-7tzvd" podUID="da3c4d1e-0025-4abf-91de-a8c230d0eb58" containerName="console" containerID="cri-o://7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69" gracePeriod=15 Apr 22 18:50:12.920707 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:12.920675 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9c6b7476c-7tzvd_da3c4d1e-0025-4abf-91de-a8c230d0eb58/console/0.log" Apr 22 18:50:12.920846 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:12.920750 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:50:13.085818 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.085784 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-oauth-config\") pod \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " Apr 22 18:50:13.085987 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.085861 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-serving-cert\") pod \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " Apr 22 18:50:13.085987 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.085911 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-config\") pod \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " Apr 22 18:50:13.085987 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.085938 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-service-ca\") pod \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " Apr 22 18:50:13.085987 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.085980 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtsdb\" (UniqueName: \"kubernetes.io/projected/da3c4d1e-0025-4abf-91de-a8c230d0eb58-kube-api-access-rtsdb\") pod \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " Apr 22 18:50:13.086184 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:50:13.086002 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-oauth-serving-cert\") pod \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\" (UID: \"da3c4d1e-0025-4abf-91de-a8c230d0eb58\") " Apr 22 18:50:13.086432 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.086321 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-config" (OuterVolumeSpecName: "console-config") pod "da3c4d1e-0025-4abf-91de-a8c230d0eb58" (UID: "da3c4d1e-0025-4abf-91de-a8c230d0eb58"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:13.086544 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.086444 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-service-ca" (OuterVolumeSpecName: "service-ca") pod "da3c4d1e-0025-4abf-91de-a8c230d0eb58" (UID: "da3c4d1e-0025-4abf-91de-a8c230d0eb58"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:13.086614 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.086578 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "da3c4d1e-0025-4abf-91de-a8c230d0eb58" (UID: "da3c4d1e-0025-4abf-91de-a8c230d0eb58"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:13.088136 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.088111 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "da3c4d1e-0025-4abf-91de-a8c230d0eb58" (UID: "da3c4d1e-0025-4abf-91de-a8c230d0eb58"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:13.088557 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.088530 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3c4d1e-0025-4abf-91de-a8c230d0eb58-kube-api-access-rtsdb" (OuterVolumeSpecName: "kube-api-access-rtsdb") pod "da3c4d1e-0025-4abf-91de-a8c230d0eb58" (UID: "da3c4d1e-0025-4abf-91de-a8c230d0eb58"). InnerVolumeSpecName "kube-api-access-rtsdb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:13.088651 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.088535 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "da3c4d1e-0025-4abf-91de-a8c230d0eb58" (UID: "da3c4d1e-0025-4abf-91de-a8c230d0eb58"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:13.186901 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.186819 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-serving-cert\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:50:13.186901 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.186848 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:50:13.186901 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.186861 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-service-ca\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:50:13.186901 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.186873 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtsdb\" (UniqueName: \"kubernetes.io/projected/da3c4d1e-0025-4abf-91de-a8c230d0eb58-kube-api-access-rtsdb\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:50:13.186901 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.186885 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da3c4d1e-0025-4abf-91de-a8c230d0eb58-oauth-serving-cert\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:50:13.186901 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.186897 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da3c4d1e-0025-4abf-91de-a8c230d0eb58-console-oauth-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:50:13.664359 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:50:13.664329 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9c6b7476c-7tzvd_da3c4d1e-0025-4abf-91de-a8c230d0eb58/console/0.log" Apr 22 18:50:13.664761 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.664373 2577 generic.go:358] "Generic (PLEG): container finished" podID="da3c4d1e-0025-4abf-91de-a8c230d0eb58" containerID="7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69" exitCode=2 Apr 22 18:50:13.664761 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.664439 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c6b7476c-7tzvd" event={"ID":"da3c4d1e-0025-4abf-91de-a8c230d0eb58","Type":"ContainerDied","Data":"7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69"} Apr 22 18:50:13.664761 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.664458 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c6b7476c-7tzvd" Apr 22 18:50:13.664761 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.664481 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c6b7476c-7tzvd" event={"ID":"da3c4d1e-0025-4abf-91de-a8c230d0eb58","Type":"ContainerDied","Data":"cee72413cab6c4d007ff0443b64e1a9dd822ce4cd675a3970a0bf396981e2702"} Apr 22 18:50:13.664761 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.664499 2577 scope.go:117] "RemoveContainer" containerID="7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69" Apr 22 18:50:13.673015 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.672997 2577 scope.go:117] "RemoveContainer" containerID="7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69" Apr 22 18:50:13.673327 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:50:13.673306 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69\": container with ID starting with 7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69 not found: ID does not exist" containerID="7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69"
Apr 22 18:50:13.673387 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.673336 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69"} err="failed to get container status \"7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69\": rpc error: code = NotFound desc = could not find container \"7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69\": container with ID starting with 7fa0e5959f5475c448b2043147e170aa1c110c05d88bf207d5cdbb74b37c9e69 not found: ID does not exist"
Apr 22 18:50:13.685341 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.685317 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9c6b7476c-7tzvd"]
Apr 22 18:50:13.688155 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:13.688130 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9c6b7476c-7tzvd"]
Apr 22 18:50:14.990713 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:14.990678 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da3c4d1e-0025-4abf-91de-a8c230d0eb58" path="/var/lib/kubelet/pods/da3c4d1e-0025-4abf-91de-a8c230d0eb58/volumes"
Apr 22 18:50:27.525750 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.525703 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dbb8f7464-g9lp7"]
Apr 22 18:50:27.527083 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.527058 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da3c4d1e-0025-4abf-91de-a8c230d0eb58" containerName="console"
Apr 22 18:50:27.527255 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.527244 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3c4d1e-0025-4abf-91de-a8c230d0eb58" containerName="console"
Apr 22 18:50:27.527571 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.527547 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="da3c4d1e-0025-4abf-91de-a8c230d0eb58" containerName="console"
Apr 22 18:50:27.534814 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.534784 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.535452 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.535419 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dbb8f7464-g9lp7"]
Apr 22 18:50:27.606914 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.606879 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh24h\" (UniqueName: \"kubernetes.io/projected/0474d268-addc-4b52-8abe-d8ed8a4284d3-kube-api-access-qh24h\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.607072 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.606937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-oauth-config\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.607072 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.606979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-service-ca\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.607072 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.607061 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-trusted-ca-bundle\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.607188 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.607096 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-config\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.607188 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.607143 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-serving-cert\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.607188 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.607177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-oauth-serving-cert\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.707909 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.707876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh24h\" (UniqueName: \"kubernetes.io/projected/0474d268-addc-4b52-8abe-d8ed8a4284d3-kube-api-access-qh24h\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.707909 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.707916 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-oauth-config\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.708182 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.707938 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-service-ca\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.708182 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.707978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-trusted-ca-bundle\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.708182 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.708039 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-config\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.708182 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.708104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-serving-cert\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.708182 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.708147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-oauth-serving-cert\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.708914 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.708888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-oauth-serving-cert\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.709052 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.708889 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-config\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.709052 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.708896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-trusted-ca-bundle\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.709052 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.708969 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-service-ca\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.710482 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.710460 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-oauth-config\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.710609 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.710591 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-serving-cert\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.715583 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.715562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh24h\" (UniqueName: \"kubernetes.io/projected/0474d268-addc-4b52-8abe-d8ed8a4284d3-kube-api-access-qh24h\") pod \"console-5dbb8f7464-g9lp7\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.845935 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.845844 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbb8f7464-g9lp7"
Apr 22 18:50:27.965224 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:27.965200 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dbb8f7464-g9lp7"]
Apr 22 18:50:27.967376 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:50:27.967344 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0474d268_addc_4b52_8abe_d8ed8a4284d3.slice/crio-d022e9ecb005c8d761ea46ffa2fb485751c7502d0a1a3024532c550d2cf01c7f WatchSource:0}: Error finding container d022e9ecb005c8d761ea46ffa2fb485751c7502d0a1a3024532c550d2cf01c7f: Status 404 returned error can't find the container with id d022e9ecb005c8d761ea46ffa2fb485751c7502d0a1a3024532c550d2cf01c7f
Apr 22 18:50:28.708795 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:28.708762 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbb8f7464-g9lp7" event={"ID":"0474d268-addc-4b52-8abe-d8ed8a4284d3","Type":"ContainerStarted","Data":"e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b"}
Apr 22 18:50:28.708795 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:28.708798 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbb8f7464-g9lp7" event={"ID":"0474d268-addc-4b52-8abe-d8ed8a4284d3","Type":"ContainerStarted","Data":"d022e9ecb005c8d761ea46ffa2fb485751c7502d0a1a3024532c550d2cf01c7f"}
Apr 22 18:50:28.726480 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:28.726435 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dbb8f7464-g9lp7" podStartSLOduration=1.7264217990000001 podStartE2EDuration="1.726421799s" podCreationTimestamp="2026-04-22 18:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:50:28.724762474 +0000 UTC m=+216.249426026" watchObservedRunningTime="2026-04-22 18:50:28.726421799 +0000 UTC m=+216.251085349"
Apr 22 18:50:29.257172 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.257135 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:50:29.257594 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.257571 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="alertmanager" containerID="cri-o://5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011" gracePeriod=120
Apr 22 18:50:29.257723 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.257651 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy-metric" containerID="cri-o://a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434" gracePeriod=120
Apr 22 18:50:29.257723 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.257712 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy" containerID="cri-o://11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142" gracePeriod=120
Apr 22 18:50:29.257861 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.257742 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="prom-label-proxy" containerID="cri-o://b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db" gracePeriod=120
Apr 22 18:50:29.257861 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.257656 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy-web" containerID="cri-o://0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7" gracePeriod=120
Apr 22 18:50:29.257861 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.257676 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="config-reloader" containerID="cri-o://fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7" gracePeriod=120
Apr 22 18:50:29.714395 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.714366 2577 generic.go:358] "Generic (PLEG): container finished" podID="c03cef71-3c3e-493f-b282-90922361220a" containerID="b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db" exitCode=0
Apr 22 18:50:29.714395 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.714392 2577 generic.go:358] "Generic (PLEG): container finished" podID="c03cef71-3c3e-493f-b282-90922361220a" containerID="11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142" exitCode=0
Apr 22 18:50:29.714762 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.714412 2577 generic.go:358] "Generic (PLEG): container finished" podID="c03cef71-3c3e-493f-b282-90922361220a" containerID="fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7" exitCode=0
Apr 22 18:50:29.714762 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.714420 2577 generic.go:358] "Generic (PLEG): container finished" podID="c03cef71-3c3e-493f-b282-90922361220a" containerID="5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011" exitCode=0
Apr 22 18:50:29.714762 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.714444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerDied","Data":"b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db"}
Apr 22 18:50:29.714762 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.714477 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerDied","Data":"11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142"}
Apr 22 18:50:29.714762 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.714499 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerDied","Data":"fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7"}
Apr 22 18:50:29.714762 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:29.714512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerDied","Data":"5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011"}
Apr 22 18:50:30.504458 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.504436 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:50:30.534181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.533801 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-tls-assets\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.533857 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.533888 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-main-db\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.533914 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-web-config\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.533984 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-config-volume\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.534024 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdxvr\" (UniqueName: \"kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-kube-api-access-fdxvr\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.534066 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-web\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.534090 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-metrics-client-ca\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.534121 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.534144 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-cluster-tls-config\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.534178 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-trusted-ca-bundle\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534846 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.534224 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-main-tls\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.534846 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.534249 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-config-out\") pod \"c03cef71-3c3e-493f-b282-90922361220a\" (UID: \"c03cef71-3c3e-493f-b282-90922361220a\") "
Apr 22 18:50:30.535180 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.534936 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:50:30.535957 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.535665 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:50:30.536865 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.536823 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:50:30.539899 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.537944 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:50:30.541053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.541017 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-config-out" (OuterVolumeSpecName: "config-out") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:50:30.541053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.541030 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:30.541187 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.541119 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:30.541187 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.541136 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:30.541187 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.541164 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:30.541595 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.541567 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-config-volume" (OuterVolumeSpecName: "config-volume") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:30.542464 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.542438 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-kube-api-access-fdxvr" (OuterVolumeSpecName: "kube-api-access-fdxvr") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "kube-api-access-fdxvr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:50:30.547789 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.547759 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:30.554502 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.554469 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-web-config" (OuterVolumeSpecName: "web-config") pod "c03cef71-3c3e-493f-b282-90922361220a" (UID: "c03cef71-3c3e-493f-b282-90922361220a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:30.635181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635106 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fdxvr\" (UniqueName: \"kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-kube-api-access-fdxvr\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635132 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635144 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-metrics-client-ca\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635153 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635162 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-cluster-tls-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635171 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635180 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-main-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635188 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-config-out\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635532 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635198 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c03cef71-3c3e-493f-b282-90922361220a-tls-assets\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635532 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635206 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635532 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635215 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c03cef71-3c3e-493f-b282-90922361220a-alertmanager-main-db\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635532 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635224 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-web-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.635532 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.635232 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c03cef71-3c3e-493f-b282-90922361220a-config-volume\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:50:30.720758 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.720722 2577 generic.go:358] "Generic (PLEG): container finished" podID="c03cef71-3c3e-493f-b282-90922361220a" containerID="a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434" exitCode=0
Apr 22 18:50:30.720758 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.720748 2577 generic.go:358] "Generic (PLEG): container finished" podID="c03cef71-3c3e-493f-b282-90922361220a" containerID="0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7" exitCode=0
Apr 22 18:50:30.721187 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.720798 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerDied","Data":"a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434"}
Apr 22 18:50:30.721187 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.720825 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:50:30.721187 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.720839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerDied","Data":"0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7"}
Apr 22 18:50:30.721187 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.720851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c03cef71-3c3e-493f-b282-90922361220a","Type":"ContainerDied","Data":"2ce2b3afab10c1e2be3c9ed68edfdba7027cf1cd267b7b4407cf127ed616393f"}
Apr 22 18:50:30.721187 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.720866 2577 scope.go:117] "RemoveContainer" containerID="b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db"
Apr 22 18:50:30.728913 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.728891 2577 scope.go:117] "RemoveContainer" containerID="a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434"
Apr 22 18:50:30.735577 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.735560 2577 scope.go:117] "RemoveContainer" containerID="11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142"
Apr 22 18:50:30.742172 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.742115 2577 scope.go:117] "RemoveContainer" containerID="0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7"
Apr 22 18:50:30.744133 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.744111 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:50:30.749595 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.749570 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:50:30.754922 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.754892 2577 scope.go:117] "RemoveContainer" containerID="fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7"
Apr 22 18:50:30.764254 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.764232 2577 scope.go:117] "RemoveContainer" containerID="5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011"
Apr 22 18:50:30.770744 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.770719 2577 scope.go:117] "RemoveContainer" containerID="b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393"
Apr 22 18:50:30.774012 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.773989 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:50:30.774406 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774387 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy-metric"
Apr 22 18:50:30.774501 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774410 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy-metric"
Apr 22 18:50:30.774501 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774423 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="prom-label-proxy"
Apr 22 18:50:30.774501 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774431 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="prom-label-proxy"
Apr 22 18:50:30.774501 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774441 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="config-reloader"
Apr 22 18:50:30.774501 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774448 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="config-reloader"
Apr 22 18:50:30.774501 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774461 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="init-config-reloader"
Apr 22 18:50:30.774501 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774469 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="init-config-reloader"
Apr 22 18:50:30.774501 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774487 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="alertmanager"
Apr 22 18:50:30.774501 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774495 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="alertmanager"
Apr 22 18:50:30.774501 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774505 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy"
Apr 22 18:50:30.774958 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774513 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy"
Apr 22 18:50:30.774958 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774525 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy-web"
Apr 22 18:50:30.774958 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774533 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy-web"
Apr 22 18:50:30.774958 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774623 2577 memory_manager.go:356] "RemoveStaleState removing state"
podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="config-reloader" Apr 22 18:50:30.774958 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774634 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy" Apr 22 18:50:30.774958 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774648 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy-web" Apr 22 18:50:30.774958 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774659 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="alertmanager" Apr 22 18:50:30.774958 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774671 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="kube-rbac-proxy-metric" Apr 22 18:50:30.774958 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.774681 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c03cef71-3c3e-493f-b282-90922361220a" containerName="prom-label-proxy" Apr 22 18:50:30.777822 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.777807 2577 scope.go:117] "RemoveContainer" containerID="b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db" Apr 22 18:50:30.778046 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:50:30.778026 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db\": container with ID starting with b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db not found: ID does not exist" containerID="b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db" Apr 22 18:50:30.778091 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.778055 2577 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db"} err="failed to get container status \"b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db\": rpc error: code = NotFound desc = could not find container \"b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db\": container with ID starting with b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db not found: ID does not exist" Apr 22 18:50:30.778091 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.778073 2577 scope.go:117] "RemoveContainer" containerID="a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434" Apr 22 18:50:30.778286 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:50:30.778253 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434\": container with ID starting with a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434 not found: ID does not exist" containerID="a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434" Apr 22 18:50:30.778325 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.778288 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434"} err="failed to get container status \"a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434\": rpc error: code = NotFound desc = could not find container \"a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434\": container with ID starting with a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434 not found: ID does not exist" Apr 22 18:50:30.778325 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.778303 2577 scope.go:117] "RemoveContainer" containerID="11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142" Apr 22 
18:50:30.778514 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:50:30.778498 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142\": container with ID starting with 11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142 not found: ID does not exist" containerID="11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142" Apr 22 18:50:30.778553 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.778519 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142"} err="failed to get container status \"11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142\": rpc error: code = NotFound desc = could not find container \"11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142\": container with ID starting with 11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142 not found: ID does not exist" Apr 22 18:50:30.778553 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.778531 2577 scope.go:117] "RemoveContainer" containerID="0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7" Apr 22 18:50:30.778748 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:50:30.778734 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7\": container with ID starting with 0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7 not found: ID does not exist" containerID="0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7" Apr 22 18:50:30.778786 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.778752 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7"} err="failed to get container status \"0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7\": rpc error: code = NotFound desc = could not find container \"0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7\": container with ID starting with 0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7 not found: ID does not exist" Apr 22 18:50:30.778786 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.778763 2577 scope.go:117] "RemoveContainer" containerID="fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7" Apr 22 18:50:30.778968 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:50:30.778953 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7\": container with ID starting with fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7 not found: ID does not exist" containerID="fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7" Apr 22 18:50:30.779015 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.778969 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7"} err="failed to get container status \"fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7\": rpc error: code = NotFound desc = could not find container \"fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7\": container with ID starting with fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7 not found: ID does not exist" Apr 22 18:50:30.779015 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.778980 2577 scope.go:117] "RemoveContainer" containerID="5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011" Apr 22 18:50:30.779158 ip-10-0-143-56 
kubenswrapper[2577]: E0422 18:50:30.779144 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011\": container with ID starting with 5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011 not found: ID does not exist" containerID="5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011" Apr 22 18:50:30.779192 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.779161 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011"} err="failed to get container status \"5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011\": rpc error: code = NotFound desc = could not find container \"5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011\": container with ID starting with 5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011 not found: ID does not exist" Apr 22 18:50:30.779192 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.779172 2577 scope.go:117] "RemoveContainer" containerID="b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393" Apr 22 18:50:30.779387 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:50:30.779370 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393\": container with ID starting with b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393 not found: ID does not exist" containerID="b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393" Apr 22 18:50:30.779441 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.779392 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393"} err="failed to 
get container status \"b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393\": rpc error: code = NotFound desc = could not find container \"b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393\": container with ID starting with b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393 not found: ID does not exist" Apr 22 18:50:30.779441 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.779407 2577 scope.go:117] "RemoveContainer" containerID="b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db" Apr 22 18:50:30.779605 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.779588 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db"} err="failed to get container status \"b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db\": rpc error: code = NotFound desc = could not find container \"b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db\": container with ID starting with b5c176250e78b1b3f09a7a2020b44cdeb0f7850ccb59e87fedf718da65fc34db not found: ID does not exist" Apr 22 18:50:30.779654 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.779606 2577 scope.go:117] "RemoveContainer" containerID="a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434" Apr 22 18:50:30.779779 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.779763 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434"} err="failed to get container status \"a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434\": rpc error: code = NotFound desc = could not find container \"a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434\": container with ID starting with a10a0e9fce28b8c85cf6deb57d31cad66a0f9370793ed87348b5f5978be93434 not found: ID does not exist" Apr 22 18:50:30.779828 
ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.779780 2577 scope.go:117] "RemoveContainer" containerID="11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142" Apr 22 18:50:30.779982 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.779963 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142"} err="failed to get container status \"11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142\": rpc error: code = NotFound desc = could not find container \"11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142\": container with ID starting with 11563e8ebc22e192c4451332ddcc993ad224fb3cb3652de3e5a30bd3506bf142 not found: ID does not exist" Apr 22 18:50:30.780047 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.779985 2577 scope.go:117] "RemoveContainer" containerID="0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7" Apr 22 18:50:30.780047 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.779993 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.780234 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.780205 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7"} err="failed to get container status \"0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7\": rpc error: code = NotFound desc = could not find container \"0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7\": container with ID starting with 0dff2c3e57400e525e372a6869338c8241deafdd64be9c7adcc262b3be445cc7 not found: ID does not exist" Apr 22 18:50:30.780349 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.780236 2577 scope.go:117] "RemoveContainer" containerID="fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7" Apr 22 18:50:30.780612 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.780592 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7"} err="failed to get container status \"fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7\": rpc error: code = NotFound desc = could not find container \"fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7\": container with ID starting with fea30f067f918f9ec7f619e635e84164b40262cd96934fe98ececcdff606b4d7 not found: ID does not exist" Apr 22 18:50:30.780612 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.780611 2577 scope.go:117] "RemoveContainer" containerID="5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011" Apr 22 18:50:30.780830 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.780813 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011"} err="failed to get container status 
\"5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011\": rpc error: code = NotFound desc = could not find container \"5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011\": container with ID starting with 5ced0870c4975edf6b5ba0e4704166eb74df0b9e740f9737dbbfb26e949c5011 not found: ID does not exist" Apr 22 18:50:30.780830 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.780829 2577 scope.go:117] "RemoveContainer" containerID="b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393" Apr 22 18:50:30.781032 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.781014 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393"} err="failed to get container status \"b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393\": rpc error: code = NotFound desc = could not find container \"b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393\": container with ID starting with b85d1074f0396f798f6cd6bb71fc0e676cef0dac5698002599fe31184b784393 not found: ID does not exist" Apr 22 18:50:30.782529 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.782507 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:50:30.782622 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.782549 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:50:30.782622 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.782519 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:50:30.782756 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.782619 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:50:30.782756 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.782630 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5tfxw\"" Apr 22 18:50:30.782756 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.782593 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:50:30.782954 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.782900 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:50:30.782954 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.782905 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:50:30.783227 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.783210 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:50:30.787373 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.787350 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:50:30.789319 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.789296 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:50:30.836352 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836315 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836352 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836571 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15d9b432-370b-4e02-af61-f3f0163e829c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836571 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836449 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836571 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gflt\" (UniqueName: \"kubernetes.io/projected/15d9b432-370b-4e02-af61-f3f0163e829c-kube-api-access-2gflt\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836571 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836532 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15d9b432-370b-4e02-af61-f3f0163e829c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836571 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836557 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15d9b432-370b-4e02-af61-f3f0163e829c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836763 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15d9b432-370b-4e02-af61-f3f0163e829c-config-out\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836763 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836610 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836763 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-web-config\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836763 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836763 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836745 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-config-volume\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.836906 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.836764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/15d9b432-370b-4e02-af61-f3f0163e829c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937386 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-config-volume\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937386 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/15d9b432-370b-4e02-af61-f3f0163e829c-alertmanager-main-db\") 
pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937386 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937655 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937409 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937655 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15d9b432-370b-4e02-af61-f3f0163e829c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937655 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937655 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937645 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2gflt\" (UniqueName: \"kubernetes.io/projected/15d9b432-370b-4e02-af61-f3f0163e829c-kube-api-access-2gflt\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937847 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937676 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15d9b432-370b-4e02-af61-f3f0163e829c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937847 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15d9b432-370b-4e02-af61-f3f0163e829c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937847 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937738 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15d9b432-370b-4e02-af61-f3f0163e829c-config-out\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937847 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/15d9b432-370b-4e02-af61-f3f0163e829c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937847 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937804 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.937847 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-web-config\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.938128 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.937868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.939213 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.938887 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15d9b432-370b-4e02-af61-f3f0163e829c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.940143 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.940114 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15d9b432-370b-4e02-af61-f3f0163e829c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.940470 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:50:30.940452 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-config-volume\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.940569 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.940514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15d9b432-370b-4e02-af61-f3f0163e829c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.940608 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.940574 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.940650 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.940609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.940777 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.940753 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.941024 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.941008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.941108 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.941092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.942404 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.942383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15d9b432-370b-4e02-af61-f3f0163e829c-config-out\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.942582 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.942568 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15d9b432-370b-4e02-af61-f3f0163e829c-web-config\") pod \"alertmanager-main-0\" (UID: \"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.945960 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.945941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gflt\" (UniqueName: \"kubernetes.io/projected/15d9b432-370b-4e02-af61-f3f0163e829c-kube-api-access-2gflt\") pod \"alertmanager-main-0\" (UID: 
\"15d9b432-370b-4e02-af61-f3f0163e829c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:30.990377 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:30.990345 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03cef71-3c3e-493f-b282-90922361220a" path="/var/lib/kubelet/pods/c03cef71-3c3e-493f-b282-90922361220a/volumes" Apr 22 18:50:31.090250 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:31.090207 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:31.219675 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:31.219648 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:50:31.221489 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:50:31.221463 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15d9b432_370b_4e02_af61_f3f0163e829c.slice/crio-4356f07f7f04de5fd9428c95631d2607778e5a02780d378059eebff0ff941975 WatchSource:0}: Error finding container 4356f07f7f04de5fd9428c95631d2607778e5a02780d378059eebff0ff941975: Status 404 returned error can't find the container with id 4356f07f7f04de5fd9428c95631d2607778e5a02780d378059eebff0ff941975 Apr 22 18:50:31.726139 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:31.726103 2577 generic.go:358] "Generic (PLEG): container finished" podID="15d9b432-370b-4e02-af61-f3f0163e829c" containerID="a749682749a036b141ffa053fc3cd8f280286cd4d4a9b87b6b445afcc568f69d" exitCode=0 Apr 22 18:50:31.726493 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:31.726177 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"15d9b432-370b-4e02-af61-f3f0163e829c","Type":"ContainerDied","Data":"a749682749a036b141ffa053fc3cd8f280286cd4d4a9b87b6b445afcc568f69d"} Apr 22 18:50:31.726493 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:31.726202 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"15d9b432-370b-4e02-af61-f3f0163e829c","Type":"ContainerStarted","Data":"4356f07f7f04de5fd9428c95631d2607778e5a02780d378059eebff0ff941975"} Apr 22 18:50:32.731882 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:32.731846 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"15d9b432-370b-4e02-af61-f3f0163e829c","Type":"ContainerStarted","Data":"9e79c38cd13e8de836a923248e3a32cc776fb0d18ec69345be6d0e75d7ffeca8"} Apr 22 18:50:32.731882 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:32.731884 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"15d9b432-370b-4e02-af61-f3f0163e829c","Type":"ContainerStarted","Data":"2a31ce0e462cdbd4da4d2d9ca1dfa626df17dbd0c5f90d1bbc98887ae92bd6e9"} Apr 22 18:50:32.731882 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:32.731893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"15d9b432-370b-4e02-af61-f3f0163e829c","Type":"ContainerStarted","Data":"cd779a8553387edf43705bc103c38a101f7b6d0f66512537a74f68abdb81d1aa"} Apr 22 18:50:32.732431 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:32.731901 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"15d9b432-370b-4e02-af61-f3f0163e829c","Type":"ContainerStarted","Data":"f1a642f1cf7a613bbf8a55e2028aaddcf7659806147a240c92cbf06a886c1a00"} Apr 22 18:50:32.732431 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:32.731910 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"15d9b432-370b-4e02-af61-f3f0163e829c","Type":"ContainerStarted","Data":"4345b5b127d9f105206cc068f484f332e820f874ac146537855726b004a2c57b"} Apr 22 18:50:32.732431 ip-10-0-143-56 kubenswrapper[2577]: I0422 
18:50:32.731918 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"15d9b432-370b-4e02-af61-f3f0163e829c","Type":"ContainerStarted","Data":"e536310f859ad09b0ed134ebb4c9f714d88844468ca0750c00ed8ab8a0f17801"} Apr 22 18:50:32.758731 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:32.758677 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.758661379 podStartE2EDuration="2.758661379s" podCreationTimestamp="2026-04-22 18:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:50:32.755944149 +0000 UTC m=+220.280607706" watchObservedRunningTime="2026-04-22 18:50:32.758661379 +0000 UTC m=+220.283324929" Apr 22 18:50:37.846184 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:37.846142 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5dbb8f7464-g9lp7" Apr 22 18:50:37.846184 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:37.846193 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dbb8f7464-g9lp7" Apr 22 18:50:37.850839 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:37.850812 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dbb8f7464-g9lp7" Apr 22 18:50:38.754151 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:38.754119 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dbb8f7464-g9lp7" Apr 22 18:50:38.797829 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:38.797798 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-655d7c87f4-6lndf"] Apr 22 18:50:48.300770 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.300734 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/global-pull-secret-syncer-mlrzp"] Apr 22 18:50:48.304114 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.304098 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mlrzp" Apr 22 18:50:48.307082 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.307050 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:50:48.308436 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.308413 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mlrzp"] Apr 22 18:50:48.400434 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.400395 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31f9f33a-79c4-425d-991c-7eb42f160268-original-pull-secret\") pod \"global-pull-secret-syncer-mlrzp\" (UID: \"31f9f33a-79c4-425d-991c-7eb42f160268\") " pod="kube-system/global-pull-secret-syncer-mlrzp" Apr 22 18:50:48.400434 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.400439 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/31f9f33a-79c4-425d-991c-7eb42f160268-dbus\") pod \"global-pull-secret-syncer-mlrzp\" (UID: \"31f9f33a-79c4-425d-991c-7eb42f160268\") " pod="kube-system/global-pull-secret-syncer-mlrzp" Apr 22 18:50:48.400630 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.400537 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/31f9f33a-79c4-425d-991c-7eb42f160268-kubelet-config\") pod \"global-pull-secret-syncer-mlrzp\" (UID: \"31f9f33a-79c4-425d-991c-7eb42f160268\") " pod="kube-system/global-pull-secret-syncer-mlrzp" Apr 22 18:50:48.501492 ip-10-0-143-56 kubenswrapper[2577]: 
I0422 18:50:48.501456 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/31f9f33a-79c4-425d-991c-7eb42f160268-kubelet-config\") pod \"global-pull-secret-syncer-mlrzp\" (UID: \"31f9f33a-79c4-425d-991c-7eb42f160268\") " pod="kube-system/global-pull-secret-syncer-mlrzp" Apr 22 18:50:48.501628 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.501514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31f9f33a-79c4-425d-991c-7eb42f160268-original-pull-secret\") pod \"global-pull-secret-syncer-mlrzp\" (UID: \"31f9f33a-79c4-425d-991c-7eb42f160268\") " pod="kube-system/global-pull-secret-syncer-mlrzp" Apr 22 18:50:48.501628 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.501544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/31f9f33a-79c4-425d-991c-7eb42f160268-dbus\") pod \"global-pull-secret-syncer-mlrzp\" (UID: \"31f9f33a-79c4-425d-991c-7eb42f160268\") " pod="kube-system/global-pull-secret-syncer-mlrzp" Apr 22 18:50:48.501628 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.501581 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/31f9f33a-79c4-425d-991c-7eb42f160268-kubelet-config\") pod \"global-pull-secret-syncer-mlrzp\" (UID: \"31f9f33a-79c4-425d-991c-7eb42f160268\") " pod="kube-system/global-pull-secret-syncer-mlrzp" Apr 22 18:50:48.501738 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.501691 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/31f9f33a-79c4-425d-991c-7eb42f160268-dbus\") pod \"global-pull-secret-syncer-mlrzp\" (UID: \"31f9f33a-79c4-425d-991c-7eb42f160268\") " pod="kube-system/global-pull-secret-syncer-mlrzp" Apr 22 18:50:48.503947 
ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.503924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31f9f33a-79c4-425d-991c-7eb42f160268-original-pull-secret\") pod \"global-pull-secret-syncer-mlrzp\" (UID: \"31f9f33a-79c4-425d-991c-7eb42f160268\") " pod="kube-system/global-pull-secret-syncer-mlrzp" Apr 22 18:50:48.615078 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.614986 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mlrzp" Apr 22 18:50:48.733348 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.733212 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mlrzp"] Apr 22 18:50:48.735904 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:50:48.735871 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31f9f33a_79c4_425d_991c_7eb42f160268.slice/crio-af16c1d5b1e347956fa7a751c0a374071478608c962be2a63a9455376ebbb603 WatchSource:0}: Error finding container af16c1d5b1e347956fa7a751c0a374071478608c962be2a63a9455376ebbb603: Status 404 returned error can't find the container with id af16c1d5b1e347956fa7a751c0a374071478608c962be2a63a9455376ebbb603 Apr 22 18:50:48.779988 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:48.779955 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mlrzp" event={"ID":"31f9f33a-79c4-425d-991c-7eb42f160268","Type":"ContainerStarted","Data":"af16c1d5b1e347956fa7a751c0a374071478608c962be2a63a9455376ebbb603"} Apr 22 18:50:52.794325 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:52.794289 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mlrzp" 
event={"ID":"31f9f33a-79c4-425d-991c-7eb42f160268","Type":"ContainerStarted","Data":"afb8142768793758c110b2c5a1ef3a9f66e1bfe26cefc6919951957c45996b90"} Apr 22 18:50:52.809818 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:50:52.809772 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mlrzp" podStartSLOduration=1.109982898 podStartE2EDuration="4.8097595s" podCreationTimestamp="2026-04-22 18:50:48 +0000 UTC" firstStartedPulling="2026-04-22 18:50:48.737526403 +0000 UTC m=+236.262189932" lastFinishedPulling="2026-04-22 18:50:52.437303004 +0000 UTC m=+239.961966534" observedRunningTime="2026-04-22 18:50:52.808225344 +0000 UTC m=+240.332888898" watchObservedRunningTime="2026-04-22 18:50:52.8097595 +0000 UTC m=+240.334423052" Apr 22 18:51:03.817862 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:03.817796 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-655d7c87f4-6lndf" podUID="c2b57e2f-312a-4e24-92d6-8210bfb5d014" containerName="console" containerID="cri-o://af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be" gracePeriod=15 Apr 22 18:51:04.057198 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.057176 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-655d7c87f4-6lndf_c2b57e2f-312a-4e24-92d6-8210bfb5d014/console/0.log" Apr 22 18:51:04.057328 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.057235 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:51:04.141602 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.141526 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-service-ca\") pod \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " Apr 22 18:51:04.141602 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.141575 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft7d6\" (UniqueName: \"kubernetes.io/projected/c2b57e2f-312a-4e24-92d6-8210bfb5d014-kube-api-access-ft7d6\") pod \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " Apr 22 18:51:04.141817 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.141608 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-serving-cert\") pod \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " Apr 22 18:51:04.141817 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.141639 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-oauth-serving-cert\") pod \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " Apr 22 18:51:04.141817 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.141663 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-oauth-config\") pod \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " Apr 22 18:51:04.141817 
ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.141695 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-trusted-ca-bundle\") pod \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " Apr 22 18:51:04.141817 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.141711 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-config\") pod \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\" (UID: \"c2b57e2f-312a-4e24-92d6-8210bfb5d014\") " Apr 22 18:51:04.142536 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.142469 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-service-ca" (OuterVolumeSpecName: "service-ca") pod "c2b57e2f-312a-4e24-92d6-8210bfb5d014" (UID: "c2b57e2f-312a-4e24-92d6-8210bfb5d014"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:04.142536 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.142494 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c2b57e2f-312a-4e24-92d6-8210bfb5d014" (UID: "c2b57e2f-312a-4e24-92d6-8210bfb5d014"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:04.142824 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.142585 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c2b57e2f-312a-4e24-92d6-8210bfb5d014" (UID: "c2b57e2f-312a-4e24-92d6-8210bfb5d014"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:04.142892 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.142853 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-config" (OuterVolumeSpecName: "console-config") pod "c2b57e2f-312a-4e24-92d6-8210bfb5d014" (UID: "c2b57e2f-312a-4e24-92d6-8210bfb5d014"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:04.148925 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.148894 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b57e2f-312a-4e24-92d6-8210bfb5d014-kube-api-access-ft7d6" (OuterVolumeSpecName: "kube-api-access-ft7d6") pod "c2b57e2f-312a-4e24-92d6-8210bfb5d014" (UID: "c2b57e2f-312a-4e24-92d6-8210bfb5d014"). InnerVolumeSpecName "kube-api-access-ft7d6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:04.149093 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.149034 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c2b57e2f-312a-4e24-92d6-8210bfb5d014" (UID: "c2b57e2f-312a-4e24-92d6-8210bfb5d014"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:04.150406 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.150382 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c2b57e2f-312a-4e24-92d6-8210bfb5d014" (UID: "c2b57e2f-312a-4e24-92d6-8210bfb5d014"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:04.242715 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.242687 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-oauth-serving-cert\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:51:04.242715 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.242711 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-oauth-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:51:04.242715 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.242721 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-trusted-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:51:04.242908 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.242730 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:51:04.242908 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.242738 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c2b57e2f-312a-4e24-92d6-8210bfb5d014-service-ca\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:51:04.242908 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.242746 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ft7d6\" (UniqueName: \"kubernetes.io/projected/c2b57e2f-312a-4e24-92d6-8210bfb5d014-kube-api-access-ft7d6\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:51:04.242908 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.242755 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2b57e2f-312a-4e24-92d6-8210bfb5d014-console-serving-cert\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:51:04.832208 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.832181 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-655d7c87f4-6lndf_c2b57e2f-312a-4e24-92d6-8210bfb5d014/console/0.log" Apr 22 18:51:04.832603 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.832219 2577 generic.go:358] "Generic (PLEG): container finished" podID="c2b57e2f-312a-4e24-92d6-8210bfb5d014" containerID="af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be" exitCode=2 Apr 22 18:51:04.832603 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.832255 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-655d7c87f4-6lndf" event={"ID":"c2b57e2f-312a-4e24-92d6-8210bfb5d014","Type":"ContainerDied","Data":"af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be"} Apr 22 18:51:04.832603 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.832320 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-655d7c87f4-6lndf" event={"ID":"c2b57e2f-312a-4e24-92d6-8210bfb5d014","Type":"ContainerDied","Data":"59a8a980e4dd986fab25f12e4c7788e6f33256804e436e9353a226d26b955f32"} Apr 22 18:51:04.832603 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:51:04.832339 2577 scope.go:117] "RemoveContainer" containerID="af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be" Apr 22 18:51:04.832603 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.832338 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-655d7c87f4-6lndf" Apr 22 18:51:04.841144 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.841126 2577 scope.go:117] "RemoveContainer" containerID="af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be" Apr 22 18:51:04.841410 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:51:04.841389 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be\": container with ID starting with af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be not found: ID does not exist" containerID="af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be" Apr 22 18:51:04.841481 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.841422 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be"} err="failed to get container status \"af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be\": rpc error: code = NotFound desc = could not find container \"af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be\": container with ID starting with af321bc0f3222c0fdfb8a6f43d723414984cf801b212ff01491951d1f2fd87be not found: ID does not exist" Apr 22 18:51:04.852846 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.852821 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-655d7c87f4-6lndf"] Apr 22 18:51:04.858396 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.858377 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-655d7c87f4-6lndf"] Apr 22 18:51:04.989951 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:04.989909 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b57e2f-312a-4e24-92d6-8210bfb5d014" path="/var/lib/kubelet/pods/c2b57e2f-312a-4e24-92d6-8210bfb5d014/volumes" Apr 22 18:51:52.862410 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:52.862369 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 18:51:52.863999 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:52.863978 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 18:51:52.871674 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:52.871657 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:51:53.130173 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.130103 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx"] Apr 22 18:51:53.132262 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.130482 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2b57e2f-312a-4e24-92d6-8210bfb5d014" containerName="console" Apr 22 18:51:53.132262 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.130495 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b57e2f-312a-4e24-92d6-8210bfb5d014" containerName="console" Apr 22 18:51:53.132262 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.130544 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2b57e2f-312a-4e24-92d6-8210bfb5d014" containerName="console" Apr 22 18:51:53.133393 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.133376 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:51:53.136076 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.136052 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:51:53.137105 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.137069 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:51:53.137204 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.137077 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rjz2k\"" Apr 22 18:51:53.141966 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.141944 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx"] Apr 22 18:51:53.161732 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.161707 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:51:53.161878 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.161747 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smlds\" (UniqueName: \"kubernetes.io/projected/6f630a62-41e4-4cc5-98bf-35505b39d2da-kube-api-access-smlds\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:51:53.161878 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.161849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:51:53.262426 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.262395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:51:53.262596 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.262451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:51:53.262596 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.262484 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smlds\" (UniqueName: \"kubernetes.io/projected/6f630a62-41e4-4cc5-98bf-35505b39d2da-kube-api-access-smlds\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:51:53.262774 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.262754 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:51:53.262853 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.262834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:51:53.270468 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.270427 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smlds\" (UniqueName: \"kubernetes.io/projected/6f630a62-41e4-4cc5-98bf-35505b39d2da-kube-api-access-smlds\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:51:53.444580 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.444504 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:51:53.768558 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.768527 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx"] Apr 22 18:51:53.770919 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:51:53.770888 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f630a62_41e4_4cc5_98bf_35505b39d2da.slice/crio-653406b203ccf4710ffd1b104498ab14ca7f6e9434a0c11ddccbadb4a2d6db72 WatchSource:0}: Error finding container 653406b203ccf4710ffd1b104498ab14ca7f6e9434a0c11ddccbadb4a2d6db72: Status 404 returned error can't find the container with id 653406b203ccf4710ffd1b104498ab14ca7f6e9434a0c11ddccbadb4a2d6db72 Apr 22 18:51:53.972188 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:53.972152 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" event={"ID":"6f630a62-41e4-4cc5-98bf-35505b39d2da","Type":"ContainerStarted","Data":"653406b203ccf4710ffd1b104498ab14ca7f6e9434a0c11ddccbadb4a2d6db72"} Apr 22 18:51:59.993083 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:59.993045 2577 generic.go:358] "Generic (PLEG): container finished" podID="6f630a62-41e4-4cc5-98bf-35505b39d2da" containerID="d110751a1cba944d61f71f73c999f9cf91f6ccc4e178dbdaaad0f7cd77a2e1ac" exitCode=0 Apr 22 18:51:59.993517 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:51:59.993092 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" event={"ID":"6f630a62-41e4-4cc5-98bf-35505b39d2da","Type":"ContainerDied","Data":"d110751a1cba944d61f71f73c999f9cf91f6ccc4e178dbdaaad0f7cd77a2e1ac"} Apr 22 18:51:59.994057 ip-10-0-143-56 kubenswrapper[2577]: I0422 
18:51:59.994040 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:52:03.002672 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:03.002636 2577 generic.go:358] "Generic (PLEG): container finished" podID="6f630a62-41e4-4cc5-98bf-35505b39d2da" containerID="386cf1f75e037d5765a58677584be393f5b9f6289c239171c89736e7520ac84d" exitCode=0 Apr 22 18:52:03.003077 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:03.002726 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" event={"ID":"6f630a62-41e4-4cc5-98bf-35505b39d2da","Type":"ContainerDied","Data":"386cf1f75e037d5765a58677584be393f5b9f6289c239171c89736e7520ac84d"} Apr 22 18:52:10.026465 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:10.026427 2577 generic.go:358] "Generic (PLEG): container finished" podID="6f630a62-41e4-4cc5-98bf-35505b39d2da" containerID="9a0962509169b2012bb0e751ce2abd6d3d07921ffc2ada82cc32dec368dede2a" exitCode=0 Apr 22 18:52:10.026858 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:10.026499 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" event={"ID":"6f630a62-41e4-4cc5-98bf-35505b39d2da","Type":"ContainerDied","Data":"9a0962509169b2012bb0e751ce2abd6d3d07921ffc2ada82cc32dec368dede2a"} Apr 22 18:52:11.154097 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:11.154069 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:52:11.222371 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:11.222339 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smlds\" (UniqueName: \"kubernetes.io/projected/6f630a62-41e4-4cc5-98bf-35505b39d2da-kube-api-access-smlds\") pod \"6f630a62-41e4-4cc5-98bf-35505b39d2da\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " Apr 22 18:52:11.222544 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:11.222395 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-bundle\") pod \"6f630a62-41e4-4cc5-98bf-35505b39d2da\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " Apr 22 18:52:11.222544 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:11.222426 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-util\") pod \"6f630a62-41e4-4cc5-98bf-35505b39d2da\" (UID: \"6f630a62-41e4-4cc5-98bf-35505b39d2da\") " Apr 22 18:52:11.222947 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:11.222919 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-bundle" (OuterVolumeSpecName: "bundle") pod "6f630a62-41e4-4cc5-98bf-35505b39d2da" (UID: "6f630a62-41e4-4cc5-98bf-35505b39d2da"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:11.224594 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:11.224560 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f630a62-41e4-4cc5-98bf-35505b39d2da-kube-api-access-smlds" (OuterVolumeSpecName: "kube-api-access-smlds") pod "6f630a62-41e4-4cc5-98bf-35505b39d2da" (UID: "6f630a62-41e4-4cc5-98bf-35505b39d2da"). InnerVolumeSpecName "kube-api-access-smlds". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:52:11.226427 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:11.226405 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-util" (OuterVolumeSpecName: "util") pod "6f630a62-41e4-4cc5-98bf-35505b39d2da" (UID: "6f630a62-41e4-4cc5-98bf-35505b39d2da"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:11.323197 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:11.323103 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-smlds\" (UniqueName: \"kubernetes.io/projected/6f630a62-41e4-4cc5-98bf-35505b39d2da-kube-api-access-smlds\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:52:11.323197 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:11.323144 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:52:11.323197 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:11.323154 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f630a62-41e4-4cc5-98bf-35505b39d2da-util\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:52:12.033569 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:12.033542 2577 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" Apr 22 18:52:12.033741 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:12.033543 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czv7xx" event={"ID":"6f630a62-41e4-4cc5-98bf-35505b39d2da","Type":"ContainerDied","Data":"653406b203ccf4710ffd1b104498ab14ca7f6e9434a0c11ddccbadb4a2d6db72"} Apr 22 18:52:12.033741 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:12.033646 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="653406b203ccf4710ffd1b104498ab14ca7f6e9434a0c11ddccbadb4a2d6db72" Apr 22 18:52:19.222123 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.222083 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b"] Apr 22 18:52:19.222701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.222582 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f630a62-41e4-4cc5-98bf-35505b39d2da" containerName="extract" Apr 22 18:52:19.222701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.222602 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f630a62-41e4-4cc5-98bf-35505b39d2da" containerName="extract" Apr 22 18:52:19.222701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.222636 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f630a62-41e4-4cc5-98bf-35505b39d2da" containerName="pull" Apr 22 18:52:19.222701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.222647 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f630a62-41e4-4cc5-98bf-35505b39d2da" containerName="pull" Apr 22 18:52:19.222701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.222669 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f630a62-41e4-4cc5-98bf-35505b39d2da" 
containerName="util" Apr 22 18:52:19.222701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.222678 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f630a62-41e4-4cc5-98bf-35505b39d2da" containerName="util" Apr 22 18:52:19.222993 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.222757 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f630a62-41e4-4cc5-98bf-35505b39d2da" containerName="extract" Apr 22 18:52:19.277915 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.277887 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b"] Apr 22 18:52:19.278078 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.278003 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:19.280655 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.280631 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 18:52:19.280804 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.280671 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 18:52:19.281752 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.281726 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 18:52:19.281752 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.281749 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 18:52:19.281923 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.281791 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 18:52:19.281923 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.281750 2577 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-lwngs\"" Apr 22 18:52:19.391947 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.391913 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:19.392120 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.391959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:19.392120 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.392002 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpj8\" (UniqueName: \"kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-kube-api-access-tkpj8\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:19.407861 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.407834 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-nghqd"] Apr 22 18:52:19.433008 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.432978 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nghqd"] Apr 22 18:52:19.433152 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.433092 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nghqd" Apr 22 18:52:19.435479 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.435457 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 18:52:19.492659 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.492590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:19.492659 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.492625 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:19.492659 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.492655 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgj8h\" (UniqueName: \"kubernetes.io/projected/63064034-0544-47eb-b926-10680b212579-kube-api-access-tgj8h\") pod \"keda-admission-cf49989db-nghqd\" (UID: \"63064034-0544-47eb-b926-10680b212579\") " pod="openshift-keda/keda-admission-cf49989db-nghqd" Apr 22 18:52:19.492926 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.492686 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkpj8\" (UniqueName: \"kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-kube-api-access-tkpj8\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:19.492926 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.492714 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/63064034-0544-47eb-b926-10680b212579-certificates\") pod \"keda-admission-cf49989db-nghqd\" (UID: \"63064034-0544-47eb-b926-10680b212579\") " pod="openshift-keda/keda-admission-cf49989db-nghqd" Apr 22 18:52:19.492926 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:19.492745 2577 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:52:19.492926 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:19.492765 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:52:19.492926 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:19.492785 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b: references non-existent secret key: tls.crt Apr 22 18:52:19.492926 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:19.492843 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates podName:bf056d48-a396-4c93-a46f-b38f0ce0c2b5 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:19.992828129 +0000 UTC m=+327.517491659 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates") pod "keda-metrics-apiserver-7c9f485588-q8x6b" (UID: "bf056d48-a396-4c93-a46f-b38f0ce0c2b5") : references non-existent secret key: tls.crt Apr 22 18:52:19.493200 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.493081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:19.501873 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.501850 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpj8\" (UniqueName: \"kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-kube-api-access-tkpj8\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:19.593742 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.593714 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgj8h\" (UniqueName: \"kubernetes.io/projected/63064034-0544-47eb-b926-10680b212579-kube-api-access-tgj8h\") pod \"keda-admission-cf49989db-nghqd\" (UID: \"63064034-0544-47eb-b926-10680b212579\") " pod="openshift-keda/keda-admission-cf49989db-nghqd" Apr 22 18:52:19.593925 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.593758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/63064034-0544-47eb-b926-10680b212579-certificates\") pod \"keda-admission-cf49989db-nghqd\" (UID: \"63064034-0544-47eb-b926-10680b212579\") " pod="openshift-keda/keda-admission-cf49989db-nghqd" Apr 22 
18:52:19.593925 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:19.593886 2577 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 22 18:52:19.593925 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:19.593904 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-nghqd: secret "keda-admission-webhooks-certs" not found Apr 22 18:52:19.594088 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:19.593949 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63064034-0544-47eb-b926-10680b212579-certificates podName:63064034-0544-47eb-b926-10680b212579 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:20.09393656 +0000 UTC m=+327.618600090 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/63064034-0544-47eb-b926-10680b212579-certificates") pod "keda-admission-cf49989db-nghqd" (UID: "63064034-0544-47eb-b926-10680b212579") : secret "keda-admission-webhooks-certs" not found Apr 22 18:52:19.602837 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.602815 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgj8h\" (UniqueName: \"kubernetes.io/projected/63064034-0544-47eb-b926-10680b212579-kube-api-access-tgj8h\") pod \"keda-admission-cf49989db-nghqd\" (UID: \"63064034-0544-47eb-b926-10680b212579\") " pod="openshift-keda/keda-admission-cf49989db-nghqd" Apr 22 18:52:19.996740 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:19.996706 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:19.996926 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:19.996850 2577 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:52:19.996926 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:19.996872 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:52:19.996926 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:19.996893 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b: references non-existent secret key: tls.crt Apr 22 18:52:19.997042 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:19.996955 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates podName:bf056d48-a396-4c93-a46f-b38f0ce0c2b5 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:20.996938555 +0000 UTC m=+328.521602085 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates") pod "keda-metrics-apiserver-7c9f485588-q8x6b" (UID: "bf056d48-a396-4c93-a46f-b38f0ce0c2b5") : references non-existent secret key: tls.crt Apr 22 18:52:20.097478 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:20.097439 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/63064034-0544-47eb-b926-10680b212579-certificates\") pod \"keda-admission-cf49989db-nghqd\" (UID: \"63064034-0544-47eb-b926-10680b212579\") " pod="openshift-keda/keda-admission-cf49989db-nghqd" Apr 22 18:52:20.099989 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:20.099956 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/63064034-0544-47eb-b926-10680b212579-certificates\") pod \"keda-admission-cf49989db-nghqd\" (UID: \"63064034-0544-47eb-b926-10680b212579\") " pod="openshift-keda/keda-admission-cf49989db-nghqd" Apr 22 18:52:20.343294 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:20.343183 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nghqd" Apr 22 18:52:20.469883 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:20.469852 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nghqd"] Apr 22 18:52:20.473115 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:52:20.473090 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63064034_0544_47eb_b926_10680b212579.slice/crio-cd6e8678413c8c7719c7cb38d39294856a293882ebad9cad833a9f6dbb82a3d3 WatchSource:0}: Error finding container cd6e8678413c8c7719c7cb38d39294856a293882ebad9cad833a9f6dbb82a3d3: Status 404 returned error can't find the container with id cd6e8678413c8c7719c7cb38d39294856a293882ebad9cad833a9f6dbb82a3d3 Apr 22 18:52:21.005812 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:21.005782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:21.005975 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:21.005914 2577 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:52:21.005975 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:21.005930 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:52:21.005975 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:21.005947 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b: references non-existent secret key: tls.crt Apr 22 18:52:21.006082 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:52:21.005995 2577 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates podName:bf056d48-a396-4c93-a46f-b38f0ce0c2b5 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:23.005981659 +0000 UTC m=+330.530645188 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates") pod "keda-metrics-apiserver-7c9f485588-q8x6b" (UID: "bf056d48-a396-4c93-a46f-b38f0ce0c2b5") : references non-existent secret key: tls.crt Apr 22 18:52:21.060571 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:21.060540 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-nghqd" event={"ID":"63064034-0544-47eb-b926-10680b212579","Type":"ContainerStarted","Data":"cd6e8678413c8c7719c7cb38d39294856a293882ebad9cad833a9f6dbb82a3d3"} Apr 22 18:52:23.024312 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:23.024260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:23.026688 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:23.026659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf056d48-a396-4c93-a46f-b38f0ce0c2b5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-q8x6b\" (UID: \"bf056d48-a396-4c93-a46f-b38f0ce0c2b5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:23.067772 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:23.067740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-nghqd" 
event={"ID":"63064034-0544-47eb-b926-10680b212579","Type":"ContainerStarted","Data":"b72d8cd6214222ffb2779ba2517bb513250e2a645ef84ef9791fb3a74add4a9b"} Apr 22 18:52:23.067905 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:23.067864 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-nghqd" Apr 22 18:52:23.084262 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:23.084189 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-nghqd" podStartSLOduration=2.355775634 podStartE2EDuration="4.084174616s" podCreationTimestamp="2026-04-22 18:52:19 +0000 UTC" firstStartedPulling="2026-04-22 18:52:20.474449859 +0000 UTC m=+327.999113391" lastFinishedPulling="2026-04-22 18:52:22.202848841 +0000 UTC m=+329.727512373" observedRunningTime="2026-04-22 18:52:23.082168906 +0000 UTC m=+330.606832468" watchObservedRunningTime="2026-04-22 18:52:23.084174616 +0000 UTC m=+330.608838202" Apr 22 18:52:23.188004 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:23.187972 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:23.305069 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:23.304995 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b"] Apr 22 18:52:23.308477 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:52:23.308448 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf056d48_a396_4c93_a46f_b38f0ce0c2b5.slice/crio-b3fd0103276e5febf18f5e42c6aaa00f8b2638a10ed05e436a75ad1d6e8878e8 WatchSource:0}: Error finding container b3fd0103276e5febf18f5e42c6aaa00f8b2638a10ed05e436a75ad1d6e8878e8: Status 404 returned error can't find the container with id b3fd0103276e5febf18f5e42c6aaa00f8b2638a10ed05e436a75ad1d6e8878e8 Apr 22 18:52:24.072060 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:24.072022 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" event={"ID":"bf056d48-a396-4c93-a46f-b38f0ce0c2b5","Type":"ContainerStarted","Data":"b3fd0103276e5febf18f5e42c6aaa00f8b2638a10ed05e436a75ad1d6e8878e8"} Apr 22 18:52:27.083343 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:27.083309 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" event={"ID":"bf056d48-a396-4c93-a46f-b38f0ce0c2b5","Type":"ContainerStarted","Data":"1a58d980c6d2d5d983e835cb9fd5ba0203f7481adc746147981465c28f31f1c4"} Apr 22 18:52:27.083774 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:27.083382 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:27.101366 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:27.101314 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" 
podStartSLOduration=5.361599212 podStartE2EDuration="8.101298076s" podCreationTimestamp="2026-04-22 18:52:19 +0000 UTC" firstStartedPulling="2026-04-22 18:52:23.309784208 +0000 UTC m=+330.834447741" lastFinishedPulling="2026-04-22 18:52:26.049483076 +0000 UTC m=+333.574146605" observedRunningTime="2026-04-22 18:52:27.100337974 +0000 UTC m=+334.625001527" watchObservedRunningTime="2026-04-22 18:52:27.101298076 +0000 UTC m=+334.625961666" Apr 22 18:52:38.090718 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:38.090689 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q8x6b" Apr 22 18:52:44.074797 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:52:44.074722 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-nghqd" Apr 22 18:53:25.273418 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.273384 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l"] Apr 22 18:53:25.275696 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.275674 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" Apr 22 18:53:25.278213 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.278190 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 18:53:25.278344 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.278310 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-cmxhh\"" Apr 22 18:53:25.278669 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.278655 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:53:25.279403 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.279386 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:53:25.289600 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.289575 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l"] Apr 22 18:53:25.361455 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.361416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91627a29-eef4-43fa-b35c-2b84ba4baf93-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-wpz6l\" (UID: \"91627a29-eef4-43fa-b35c-2b84ba4baf93\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" Apr 22 18:53:25.361621 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.361554 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gwkc\" (UniqueName: \"kubernetes.io/projected/91627a29-eef4-43fa-b35c-2b84ba4baf93-kube-api-access-7gwkc\") pod \"llmisvc-controller-manager-68cc5db7c4-wpz6l\" (UID: \"91627a29-eef4-43fa-b35c-2b84ba4baf93\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" Apr 22 
18:53:25.462023 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.461983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gwkc\" (UniqueName: \"kubernetes.io/projected/91627a29-eef4-43fa-b35c-2b84ba4baf93-kube-api-access-7gwkc\") pod \"llmisvc-controller-manager-68cc5db7c4-wpz6l\" (UID: \"91627a29-eef4-43fa-b35c-2b84ba4baf93\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" Apr 22 18:53:25.462218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.462041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91627a29-eef4-43fa-b35c-2b84ba4baf93-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-wpz6l\" (UID: \"91627a29-eef4-43fa-b35c-2b84ba4baf93\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" Apr 22 18:53:25.462218 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:53:25.462152 2577 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 22 18:53:25.462218 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:53:25.462215 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91627a29-eef4-43fa-b35c-2b84ba4baf93-cert podName:91627a29-eef4-43fa-b35c-2b84ba4baf93 nodeName:}" failed. No retries permitted until 2026-04-22 18:53:25.96219629 +0000 UTC m=+393.486859819 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91627a29-eef4-43fa-b35c-2b84ba4baf93-cert") pod "llmisvc-controller-manager-68cc5db7c4-wpz6l" (UID: "91627a29-eef4-43fa-b35c-2b84ba4baf93") : secret "llmisvc-webhook-server-cert" not found Apr 22 18:53:25.471534 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.471511 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gwkc\" (UniqueName: \"kubernetes.io/projected/91627a29-eef4-43fa-b35c-2b84ba4baf93-kube-api-access-7gwkc\") pod \"llmisvc-controller-manager-68cc5db7c4-wpz6l\" (UID: \"91627a29-eef4-43fa-b35c-2b84ba4baf93\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" Apr 22 18:53:25.967223 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.967180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91627a29-eef4-43fa-b35c-2b84ba4baf93-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-wpz6l\" (UID: \"91627a29-eef4-43fa-b35c-2b84ba4baf93\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" Apr 22 18:53:25.969546 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:25.969525 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91627a29-eef4-43fa-b35c-2b84ba4baf93-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-wpz6l\" (UID: \"91627a29-eef4-43fa-b35c-2b84ba4baf93\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" Apr 22 18:53:26.186556 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:26.186518 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" Apr 22 18:53:26.313324 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:26.313297 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l"] Apr 22 18:53:26.315803 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:53:26.315776 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod91627a29_eef4_43fa_b35c_2b84ba4baf93.slice/crio-22f6ed971c3ce09214041daee6540b04a88dfac9b1286219963235b9cc80ff63 WatchSource:0}: Error finding container 22f6ed971c3ce09214041daee6540b04a88dfac9b1286219963235b9cc80ff63: Status 404 returned error can't find the container with id 22f6ed971c3ce09214041daee6540b04a88dfac9b1286219963235b9cc80ff63 Apr 22 18:53:27.264221 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:27.264177 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" event={"ID":"91627a29-eef4-43fa-b35c-2b84ba4baf93","Type":"ContainerStarted","Data":"22f6ed971c3ce09214041daee6540b04a88dfac9b1286219963235b9cc80ff63"} Apr 22 18:53:28.268725 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:28.268689 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" event={"ID":"91627a29-eef4-43fa-b35c-2b84ba4baf93","Type":"ContainerStarted","Data":"5ceeb92de88580a0a9f6f8efc44c39f1aecb1e0c2178aaeeeacfbd9f9cdbce59"} Apr 22 18:53:28.269093 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:28.268751 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" Apr 22 18:53:28.284746 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:28.284698 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" podStartSLOduration=1.447003072 podStartE2EDuration="3.284684923s" 
podCreationTimestamp="2026-04-22 18:53:25 +0000 UTC" firstStartedPulling="2026-04-22 18:53:26.317041374 +0000 UTC m=+393.841704904" lastFinishedPulling="2026-04-22 18:53:28.154723014 +0000 UTC m=+395.679386755" observedRunningTime="2026-04-22 18:53:28.283879629 +0000 UTC m=+395.808543193" watchObservedRunningTime="2026-04-22 18:53:28.284684923 +0000 UTC m=+395.809348474" Apr 22 18:53:59.275418 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:53:59.275381 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wpz6l" Apr 22 18:54:34.517233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.517196 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-764c74bbf9-x9t8c"] Apr 22 18:54:34.521114 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.521089 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.532950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.532907 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764c74bbf9-x9t8c"] Apr 22 18:54:34.546416 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.546389 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-service-ca\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.546552 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.546420 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-trusted-ca-bundle\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " 
pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.546552 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.546440 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab239c5f-2846-4e56-9332-24bbdc368b27-console-oauth-config\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.546552 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.546514 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prxvj\" (UniqueName: \"kubernetes.io/projected/ab239c5f-2846-4e56-9332-24bbdc368b27-kube-api-access-prxvj\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.546701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.546565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-oauth-serving-cert\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.546701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.546626 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab239c5f-2846-4e56-9332-24bbdc368b27-console-serving-cert\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.546701 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.546686 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-console-config\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.647518 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.647480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prxvj\" (UniqueName: \"kubernetes.io/projected/ab239c5f-2846-4e56-9332-24bbdc368b27-kube-api-access-prxvj\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.647726 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.647543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-oauth-serving-cert\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.647726 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.647583 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab239c5f-2846-4e56-9332-24bbdc368b27-console-serving-cert\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.647726 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.647628 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-console-config\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.647726 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.647680 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-service-ca\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.647726 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.647703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-trusted-ca-bundle\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.647982 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.647732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab239c5f-2846-4e56-9332-24bbdc368b27-console-oauth-config\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.648412 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.648381 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-oauth-serving-cert\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.648552 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.648381 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-console-config\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.648552 ip-10-0-143-56 kubenswrapper[2577]: I0422 
18:54:34.648519 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-service-ca\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.648552 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.648543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab239c5f-2846-4e56-9332-24bbdc368b27-trusted-ca-bundle\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.650185 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.650169 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab239c5f-2846-4e56-9332-24bbdc368b27-console-serving-cert\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.650241 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.650203 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab239c5f-2846-4e56-9332-24bbdc368b27-console-oauth-config\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.656031 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.656009 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prxvj\" (UniqueName: \"kubernetes.io/projected/ab239c5f-2846-4e56-9332-24bbdc368b27-kube-api-access-prxvj\") pod \"console-764c74bbf9-x9t8c\" (UID: \"ab239c5f-2846-4e56-9332-24bbdc368b27\") " pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.832935 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:54:34.832845 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:34.970424 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:34.970396 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764c74bbf9-x9t8c"] Apr 22 18:54:34.972712 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:54:34.972684 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab239c5f_2846_4e56_9332_24bbdc368b27.slice/crio-21a32a8f502cd894a5b6a0307de046c98dd99f7c53d262bb4c629c91413d3120 WatchSource:0}: Error finding container 21a32a8f502cd894a5b6a0307de046c98dd99f7c53d262bb4c629c91413d3120: Status 404 returned error can't find the container with id 21a32a8f502cd894a5b6a0307de046c98dd99f7c53d262bb4c629c91413d3120 Apr 22 18:54:35.499639 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.499599 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764c74bbf9-x9t8c" event={"ID":"ab239c5f-2846-4e56-9332-24bbdc368b27","Type":"ContainerStarted","Data":"b3ecc598768276ab651942768f7e17f3fd52b0c7b483e3ddad76408da738060c"} Apr 22 18:54:35.499639 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.499643 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764c74bbf9-x9t8c" event={"ID":"ab239c5f-2846-4e56-9332-24bbdc368b27","Type":"ContainerStarted","Data":"21a32a8f502cd894a5b6a0307de046c98dd99f7c53d262bb4c629c91413d3120"} Apr 22 18:54:35.519153 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.519080 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-764c74bbf9-x9t8c" podStartSLOduration=1.519061577 podStartE2EDuration="1.519061577s" podCreationTimestamp="2026-04-22 18:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:54:35.518216755 +0000 UTC m=+463.042880308" watchObservedRunningTime="2026-04-22 18:54:35.519061577 +0000 UTC m=+463.043725129" Apr 22 18:54:35.702589 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.702557 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-kv8pn"] Apr 22 18:54:35.706078 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.706057 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:54:35.708789 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.708764 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 18:54:35.708898 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.708779 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-8cmlf\"" Apr 22 18:54:35.715088 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.715062 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-kv8pn"] Apr 22 18:54:35.757810 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.757733 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56j52\" (UniqueName: \"kubernetes.io/projected/9c01586d-bde7-474a-9ba6-df24324623f0-kube-api-access-56j52\") pod \"odh-model-controller-696fc77849-kv8pn\" (UID: \"9c01586d-bde7-474a-9ba6-df24324623f0\") " pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:54:35.757941 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.757827 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c01586d-bde7-474a-9ba6-df24324623f0-cert\") pod \"odh-model-controller-696fc77849-kv8pn\" (UID: 
\"9c01586d-bde7-474a-9ba6-df24324623f0\") " pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:54:35.858409 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.858377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c01586d-bde7-474a-9ba6-df24324623f0-cert\") pod \"odh-model-controller-696fc77849-kv8pn\" (UID: \"9c01586d-bde7-474a-9ba6-df24324623f0\") " pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:54:35.858594 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.858426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56j52\" (UniqueName: \"kubernetes.io/projected/9c01586d-bde7-474a-9ba6-df24324623f0-kube-api-access-56j52\") pod \"odh-model-controller-696fc77849-kv8pn\" (UID: \"9c01586d-bde7-474a-9ba6-df24324623f0\") " pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:54:35.858594 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:54:35.858552 2577 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:54:35.858716 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:54:35.858626 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c01586d-bde7-474a-9ba6-df24324623f0-cert podName:9c01586d-bde7-474a-9ba6-df24324623f0 nodeName:}" failed. No retries permitted until 2026-04-22 18:54:36.358605461 +0000 UTC m=+463.883268994 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9c01586d-bde7-474a-9ba6-df24324623f0-cert") pod "odh-model-controller-696fc77849-kv8pn" (UID: "9c01586d-bde7-474a-9ba6-df24324623f0") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:54:35.887678 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:35.887649 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56j52\" (UniqueName: \"kubernetes.io/projected/9c01586d-bde7-474a-9ba6-df24324623f0-kube-api-access-56j52\") pod \"odh-model-controller-696fc77849-kv8pn\" (UID: \"9c01586d-bde7-474a-9ba6-df24324623f0\") " pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:54:36.365090 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:36.365060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c01586d-bde7-474a-9ba6-df24324623f0-cert\") pod \"odh-model-controller-696fc77849-kv8pn\" (UID: \"9c01586d-bde7-474a-9ba6-df24324623f0\") " pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:54:36.365278 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:54:36.365202 2577 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:54:36.365329 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:54:36.365291 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c01586d-bde7-474a-9ba6-df24324623f0-cert podName:9c01586d-bde7-474a-9ba6-df24324623f0 nodeName:}" failed. No retries permitted until 2026-04-22 18:54:37.365260354 +0000 UTC m=+464.889923882 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9c01586d-bde7-474a-9ba6-df24324623f0-cert") pod "odh-model-controller-696fc77849-kv8pn" (UID: "9c01586d-bde7-474a-9ba6-df24324623f0") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:54:37.372918 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:37.372881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c01586d-bde7-474a-9ba6-df24324623f0-cert\") pod \"odh-model-controller-696fc77849-kv8pn\" (UID: \"9c01586d-bde7-474a-9ba6-df24324623f0\") " pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:54:37.375258 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:37.375235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c01586d-bde7-474a-9ba6-df24324623f0-cert\") pod \"odh-model-controller-696fc77849-kv8pn\" (UID: \"9c01586d-bde7-474a-9ba6-df24324623f0\") " pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:54:37.517484 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:37.517452 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:54:37.633208 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:37.633172 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-kv8pn"] Apr 22 18:54:37.635667 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:54:37.635639 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c01586d_bde7_474a_9ba6_df24324623f0.slice/crio-99ab5a80dafb1b737b09cacf95fe04b478f644b8368c3fd48cfae9f9d498d01a WatchSource:0}: Error finding container 99ab5a80dafb1b737b09cacf95fe04b478f644b8368c3fd48cfae9f9d498d01a: Status 404 returned error can't find the container with id 99ab5a80dafb1b737b09cacf95fe04b478f644b8368c3fd48cfae9f9d498d01a Apr 22 18:54:38.512788 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:38.512736 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-kv8pn" event={"ID":"9c01586d-bde7-474a-9ba6-df24324623f0","Type":"ContainerStarted","Data":"99ab5a80dafb1b737b09cacf95fe04b478f644b8368c3fd48cfae9f9d498d01a"} Apr 22 18:54:41.526950 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:41.526906 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-kv8pn" event={"ID":"9c01586d-bde7-474a-9ba6-df24324623f0","Type":"ContainerStarted","Data":"46be3c9b005f95464a04e9c472b60f63fbd35b07432afc798d3146b7b629d3a4"} Apr 22 18:54:41.527373 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:41.526982 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:54:41.542671 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:41.542615 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-kv8pn" podStartSLOduration=3.36996387 podStartE2EDuration="6.542597577s" 
podCreationTimestamp="2026-04-22 18:54:35 +0000 UTC" firstStartedPulling="2026-04-22 18:54:37.636854075 +0000 UTC m=+465.161517605" lastFinishedPulling="2026-04-22 18:54:40.809487773 +0000 UTC m=+468.334151312" observedRunningTime="2026-04-22 18:54:41.541296867 +0000 UTC m=+469.065960419" watchObservedRunningTime="2026-04-22 18:54:41.542597577 +0000 UTC m=+469.067261130" Apr 22 18:54:44.833917 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:44.833875 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:44.833917 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:44.833911 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:44.839156 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:44.839130 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:45.550744 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:45.550713 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-764c74bbf9-x9t8c" Apr 22 18:54:45.595126 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:45.595091 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dbb8f7464-g9lp7"] Apr 22 18:54:52.533820 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:54:52.533790 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-kv8pn" Apr 22 18:55:10.618807 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.618751 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5dbb8f7464-g9lp7" podUID="0474d268-addc-4b52-8abe-d8ed8a4284d3" containerName="console" containerID="cri-o://e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b" gracePeriod=15 Apr 22 18:55:10.855781 
ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.855757 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dbb8f7464-g9lp7_0474d268-addc-4b52-8abe-d8ed8a4284d3/console/0.log" Apr 22 18:55:10.855909 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.855829 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbb8f7464-g9lp7" Apr 22 18:55:10.970668 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.970639 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-service-ca\") pod \"0474d268-addc-4b52-8abe-d8ed8a4284d3\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " Apr 22 18:55:10.970863 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.970698 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-oauth-config\") pod \"0474d268-addc-4b52-8abe-d8ed8a4284d3\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " Apr 22 18:55:10.970863 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.970725 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh24h\" (UniqueName: \"kubernetes.io/projected/0474d268-addc-4b52-8abe-d8ed8a4284d3-kube-api-access-qh24h\") pod \"0474d268-addc-4b52-8abe-d8ed8a4284d3\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " Apr 22 18:55:10.970863 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.970743 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-trusted-ca-bundle\") pod \"0474d268-addc-4b52-8abe-d8ed8a4284d3\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " Apr 22 18:55:10.970863 ip-10-0-143-56 kubenswrapper[2577]: 
I0422 18:55:10.970768 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-config\") pod \"0474d268-addc-4b52-8abe-d8ed8a4284d3\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " Apr 22 18:55:10.970863 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.970815 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-serving-cert\") pod \"0474d268-addc-4b52-8abe-d8ed8a4284d3\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " Apr 22 18:55:10.970863 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.970839 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-oauth-serving-cert\") pod \"0474d268-addc-4b52-8abe-d8ed8a4284d3\" (UID: \"0474d268-addc-4b52-8abe-d8ed8a4284d3\") " Apr 22 18:55:10.971291 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.971213 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0474d268-addc-4b52-8abe-d8ed8a4284d3" (UID: "0474d268-addc-4b52-8abe-d8ed8a4284d3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:10.971408 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.971233 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0474d268-addc-4b52-8abe-d8ed8a4284d3" (UID: "0474d268-addc-4b52-8abe-d8ed8a4284d3"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:10.971408 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.971366 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-config" (OuterVolumeSpecName: "console-config") pod "0474d268-addc-4b52-8abe-d8ed8a4284d3" (UID: "0474d268-addc-4b52-8abe-d8ed8a4284d3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:10.971408 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.971397 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0474d268-addc-4b52-8abe-d8ed8a4284d3" (UID: "0474d268-addc-4b52-8abe-d8ed8a4284d3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:10.973094 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.973067 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0474d268-addc-4b52-8abe-d8ed8a4284d3-kube-api-access-qh24h" (OuterVolumeSpecName: "kube-api-access-qh24h") pod "0474d268-addc-4b52-8abe-d8ed8a4284d3" (UID: "0474d268-addc-4b52-8abe-d8ed8a4284d3"). InnerVolumeSpecName "kube-api-access-qh24h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:10.973195 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.973081 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0474d268-addc-4b52-8abe-d8ed8a4284d3" (UID: "0474d268-addc-4b52-8abe-d8ed8a4284d3"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:55:10.973195 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:10.973131 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0474d268-addc-4b52-8abe-d8ed8a4284d3" (UID: "0474d268-addc-4b52-8abe-d8ed8a4284d3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:55:11.071988 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.071952 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-service-ca\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:55:11.071988 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.071983 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-oauth-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:55:11.071988 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.071995 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qh24h\" (UniqueName: \"kubernetes.io/projected/0474d268-addc-4b52-8abe-d8ed8a4284d3-kube-api-access-qh24h\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:55:11.072226 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.072006 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-trusted-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:55:11.072226 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.072016 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:55:11.072226 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.072024 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0474d268-addc-4b52-8abe-d8ed8a4284d3-console-serving-cert\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:55:11.072226 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.072033 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0474d268-addc-4b52-8abe-d8ed8a4284d3-oauth-serving-cert\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:55:11.635442 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.635409 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dbb8f7464-g9lp7_0474d268-addc-4b52-8abe-d8ed8a4284d3/console/0.log" Apr 22 18:55:11.635867 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.635451 2577 generic.go:358] "Generic (PLEG): container finished" podID="0474d268-addc-4b52-8abe-d8ed8a4284d3" containerID="e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b" exitCode=2 Apr 22 18:55:11.635867 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.635526 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5dbb8f7464-g9lp7" Apr 22 18:55:11.635867 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.635539 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbb8f7464-g9lp7" event={"ID":"0474d268-addc-4b52-8abe-d8ed8a4284d3","Type":"ContainerDied","Data":"e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b"} Apr 22 18:55:11.635867 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.635577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbb8f7464-g9lp7" event={"ID":"0474d268-addc-4b52-8abe-d8ed8a4284d3","Type":"ContainerDied","Data":"d022e9ecb005c8d761ea46ffa2fb485751c7502d0a1a3024532c550d2cf01c7f"} Apr 22 18:55:11.635867 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.635592 2577 scope.go:117] "RemoveContainer" containerID="e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b" Apr 22 18:55:11.644371 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.644351 2577 scope.go:117] "RemoveContainer" containerID="e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b" Apr 22 18:55:11.644642 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:55:11.644623 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b\": container with ID starting with e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b not found: ID does not exist" containerID="e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b" Apr 22 18:55:11.644686 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.644654 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b"} err="failed to get container status \"e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b\": rpc error: code = 
NotFound desc = could not find container \"e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b\": container with ID starting with e3a05661f06a759a29a236158f660077d6e4a5bdc75492596e8d0323b60a128b not found: ID does not exist" Apr 22 18:55:11.653305 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.653261 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dbb8f7464-g9lp7"] Apr 22 18:55:11.656915 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:11.656891 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5dbb8f7464-g9lp7"] Apr 22 18:55:12.996468 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:12.996435 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0474d268-addc-4b52-8abe-d8ed8a4284d3" path="/var/lib/kubelet/pods/0474d268-addc-4b52-8abe-d8ed8a4284d3/volumes" Apr 22 18:55:13.423055 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.422974 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558"] Apr 22 18:55:13.423365 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.423344 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0474d268-addc-4b52-8abe-d8ed8a4284d3" containerName="console" Apr 22 18:55:13.423365 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.423362 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0474d268-addc-4b52-8abe-d8ed8a4284d3" containerName="console" Apr 22 18:55:13.423547 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.423424 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0474d268-addc-4b52-8abe-d8ed8a4284d3" containerName="console" Apr 22 18:55:13.428239 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.428216 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.430842 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.430812 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\"" Apr 22 18:55:13.430949 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.430893 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\"" Apr 22 18:55:13.430949 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.430904 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:55:13.431884 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.431858 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-n6c6z\"" Apr 22 18:55:13.431884 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.431873 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:55:13.438496 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.438472 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558"] Apr 22 18:55:13.513469 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.513435 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59"] Apr 22 18:55:13.517040 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.517017 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:13.519391 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.519354 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-9844b-predictor-serving-cert\"" Apr 22 18:55:13.519529 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.519468 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-9844b-kube-rbac-proxy-sar-config\"" Apr 22 18:55:13.524431 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.524405 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59"] Apr 22 18:55:13.591956 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.591919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.592095 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.592005 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcmf5\" (UniqueName: \"kubernetes.io/projected/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kube-api-access-pcmf5\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.592095 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.592040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.592095 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.592061 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.693512 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.693429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.693512 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.693465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.693512 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.693489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-9844b-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-error-404-isvc-9844b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9844b-predictor-57bf65897f-5ns59\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:13.693759 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.693566 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv2g9\" (UniqueName: \"kubernetes.io/projected/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-kube-api-access-nv2g9\") pod \"error-404-isvc-9844b-predictor-57bf65897f-5ns59\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:13.693759 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.693614 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.693759 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.693645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-proxy-tls\") pod \"error-404-isvc-9844b-predictor-57bf65897f-5ns59\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:13.693904 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.693791 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcmf5\" (UniqueName: 
\"kubernetes.io/projected/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kube-api-access-pcmf5\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.693904 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.693852 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.694233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.694203 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.696135 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.696109 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.701811 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.701792 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcmf5\" (UniqueName: \"kubernetes.io/projected/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kube-api-access-pcmf5\") pod 
\"isvc-sklearn-graph-1-predictor-5b497dcd98-8q558\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.741919 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.741888 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:13.794504 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.794475 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv2g9\" (UniqueName: \"kubernetes.io/projected/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-kube-api-access-nv2g9\") pod \"error-404-isvc-9844b-predictor-57bf65897f-5ns59\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:13.794642 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.794562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-proxy-tls\") pod \"error-404-isvc-9844b-predictor-57bf65897f-5ns59\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:13.794704 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.794659 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-9844b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-error-404-isvc-9844b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9844b-predictor-57bf65897f-5ns59\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:13.795320 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:55:13.794862 2577 secret.go:189] Couldn't get secret 
kserve-ci-e2e-test/error-404-isvc-9844b-predictor-serving-cert: secret "error-404-isvc-9844b-predictor-serving-cert" not found Apr 22 18:55:13.795320 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:55:13.794951 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-proxy-tls podName:eba18b2a-b04d-41df-a35c-7aded0b0ca0d nodeName:}" failed. No retries permitted until 2026-04-22 18:55:14.294926981 +0000 UTC m=+501.819590526 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-proxy-tls") pod "error-404-isvc-9844b-predictor-57bf65897f-5ns59" (UID: "eba18b2a-b04d-41df-a35c-7aded0b0ca0d") : secret "error-404-isvc-9844b-predictor-serving-cert" not found Apr 22 18:55:13.795486 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.795432 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-9844b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-error-404-isvc-9844b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9844b-predictor-57bf65897f-5ns59\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:13.805108 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.805077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv2g9\" (UniqueName: \"kubernetes.io/projected/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-kube-api-access-nv2g9\") pod \"error-404-isvc-9844b-predictor-57bf65897f-5ns59\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:13.875231 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:13.874938 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558"] Apr 22 18:55:13.877390 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:55:13.877362 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ca8ccb_2c94_4c3f_9d22_4cfaed4bde53.slice/crio-328cc9104b201f0ae9280f38e849a82e93f13678140bdfac475d2378663e3091 WatchSource:0}: Error finding container 328cc9104b201f0ae9280f38e849a82e93f13678140bdfac475d2378663e3091: Status 404 returned error can't find the container with id 328cc9104b201f0ae9280f38e849a82e93f13678140bdfac475d2378663e3091 Apr 22 18:55:14.300089 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:14.300054 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-proxy-tls\") pod \"error-404-isvc-9844b-predictor-57bf65897f-5ns59\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:14.302447 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:14.302427 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-proxy-tls\") pod \"error-404-isvc-9844b-predictor-57bf65897f-5ns59\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:14.431843 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:14.431804 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:14.562577 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:14.562542 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59"] Apr 22 18:55:14.564867 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:55:14.564842 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeba18b2a_b04d_41df_a35c_7aded0b0ca0d.slice/crio-83ddc88a516a6f844a736bee691454741407cb06fd4f1c55a9d3034677520ee7 WatchSource:0}: Error finding container 83ddc88a516a6f844a736bee691454741407cb06fd4f1c55a9d3034677520ee7: Status 404 returned error can't find the container with id 83ddc88a516a6f844a736bee691454741407cb06fd4f1c55a9d3034677520ee7 Apr 22 18:55:14.650481 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:14.650437 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" event={"ID":"eba18b2a-b04d-41df-a35c-7aded0b0ca0d","Type":"ContainerStarted","Data":"83ddc88a516a6f844a736bee691454741407cb06fd4f1c55a9d3034677520ee7"} Apr 22 18:55:14.651974 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:14.651941 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" event={"ID":"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53","Type":"ContainerStarted","Data":"328cc9104b201f0ae9280f38e849a82e93f13678140bdfac475d2378663e3091"} Apr 22 18:55:28.725861 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:28.725810 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" event={"ID":"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53","Type":"ContainerStarted","Data":"d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713"} Apr 22 18:55:28.727528 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:55:28.727493 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" event={"ID":"eba18b2a-b04d-41df-a35c-7aded0b0ca0d","Type":"ContainerStarted","Data":"f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723"} Apr 22 18:55:30.735788 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:30.735749 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" event={"ID":"eba18b2a-b04d-41df-a35c-7aded0b0ca0d","Type":"ContainerStarted","Data":"81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a"} Apr 22 18:55:30.736225 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:30.735933 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:30.736225 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:30.735946 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:30.737105 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:30.737075 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 18:55:30.752085 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:30.752033 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" podStartSLOduration=1.9649340419999999 podStartE2EDuration="17.752016881s" podCreationTimestamp="2026-04-22 18:55:13 +0000 UTC" firstStartedPulling="2026-04-22 18:55:14.566762985 +0000 UTC m=+502.091426514" lastFinishedPulling="2026-04-22 
18:55:30.353845824 +0000 UTC m=+517.878509353" observedRunningTime="2026-04-22 18:55:30.751517181 +0000 UTC m=+518.276180733" watchObservedRunningTime="2026-04-22 18:55:30.752016881 +0000 UTC m=+518.276680443" Apr 22 18:55:31.739812 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:31.739771 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 18:55:32.743677 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:32.743642 2577 generic.go:358] "Generic (PLEG): container finished" podID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerID="d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713" exitCode=0 Apr 22 18:55:32.744156 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:32.743721 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" event={"ID":"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53","Type":"ContainerDied","Data":"d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713"} Apr 22 18:55:36.745887 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:36.745852 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:55:36.746444 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:36.746415 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 18:55:39.771832 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:39.771752 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" event={"ID":"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53","Type":"ContainerStarted","Data":"894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947"} Apr 22 18:55:39.771832 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:39.771793 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" event={"ID":"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53","Type":"ContainerStarted","Data":"2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee"} Apr 22 18:55:39.772246 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:39.772004 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:39.790880 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:39.790832 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podStartSLOduration=1.533969075 podStartE2EDuration="26.790792835s" podCreationTimestamp="2026-04-22 18:55:13 +0000 UTC" firstStartedPulling="2026-04-22 18:55:13.879155791 +0000 UTC m=+501.403819320" lastFinishedPulling="2026-04-22 18:55:39.135979551 +0000 UTC m=+526.660643080" observedRunningTime="2026-04-22 18:55:39.789494141 +0000 UTC m=+527.314157693" watchObservedRunningTime="2026-04-22 18:55:39.790792835 +0000 UTC m=+527.315456387" Apr 22 18:55:40.775498 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:40.775468 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:40.776824 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:40.776794 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 18:55:41.778365 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:41.778330 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 18:55:46.747345 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:46.747300 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 18:55:46.782565 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:46.782537 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:55:46.783145 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:46.783114 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 18:55:56.746347 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:56.746303 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 18:55:56.783568 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:55:56.783534 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 18:56:06.746741 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:06.746698 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 18:56:06.783498 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:06.783459 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 18:56:16.747157 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:16.747130 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:56:16.783817 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:16.783779 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 18:56:26.783151 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:26.783112 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 18:56:33.223811 
ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:33.223772 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl"] Apr 22 18:56:33.240197 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:33.240166 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl"] Apr 22 18:56:33.240350 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:33.240326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:56:33.242827 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:33.242800 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-9844b-serving-cert\"" Apr 22 18:56:33.242968 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:33.242806 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-9844b-kube-rbac-proxy-sar-config\"" Apr 22 18:56:33.334428 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:33.334393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d61a4cd2-d237-4f6a-b403-57164324e96c-proxy-tls\") pod \"switch-graph-9844b-8475b5564f-rnmkl\" (UID: \"d61a4cd2-d237-4f6a-b403-57164324e96c\") " pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:56:33.334601 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:33.334554 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d61a4cd2-d237-4f6a-b403-57164324e96c-openshift-service-ca-bundle\") pod \"switch-graph-9844b-8475b5564f-rnmkl\" (UID: \"d61a4cd2-d237-4f6a-b403-57164324e96c\") " pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:56:33.435911 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:56:33.435868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d61a4cd2-d237-4f6a-b403-57164324e96c-openshift-service-ca-bundle\") pod \"switch-graph-9844b-8475b5564f-rnmkl\" (UID: \"d61a4cd2-d237-4f6a-b403-57164324e96c\") " pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:56:33.435911 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:33.435918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d61a4cd2-d237-4f6a-b403-57164324e96c-proxy-tls\") pod \"switch-graph-9844b-8475b5564f-rnmkl\" (UID: \"d61a4cd2-d237-4f6a-b403-57164324e96c\") " pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:56:33.436179 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:56:33.436016 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-9844b-serving-cert: secret "switch-graph-9844b-serving-cert" not found Apr 22 18:56:33.436179 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:56:33.436069 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d61a4cd2-d237-4f6a-b403-57164324e96c-proxy-tls podName:d61a4cd2-d237-4f6a-b403-57164324e96c nodeName:}" failed. No retries permitted until 2026-04-22 18:56:33.936053681 +0000 UTC m=+581.460717210 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d61a4cd2-d237-4f6a-b403-57164324e96c-proxy-tls") pod "switch-graph-9844b-8475b5564f-rnmkl" (UID: "d61a4cd2-d237-4f6a-b403-57164324e96c") : secret "switch-graph-9844b-serving-cert" not found Apr 22 18:56:33.436658 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:33.436631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d61a4cd2-d237-4f6a-b403-57164324e96c-openshift-service-ca-bundle\") pod \"switch-graph-9844b-8475b5564f-rnmkl\" (UID: \"d61a4cd2-d237-4f6a-b403-57164324e96c\") " pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:56:33.941577 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:33.941539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d61a4cd2-d237-4f6a-b403-57164324e96c-proxy-tls\") pod \"switch-graph-9844b-8475b5564f-rnmkl\" (UID: \"d61a4cd2-d237-4f6a-b403-57164324e96c\") " pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:56:33.943944 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:33.943925 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d61a4cd2-d237-4f6a-b403-57164324e96c-proxy-tls\") pod \"switch-graph-9844b-8475b5564f-rnmkl\" (UID: \"d61a4cd2-d237-4f6a-b403-57164324e96c\") " pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:56:34.150712 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:34.150675 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:56:34.275380 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:34.275352 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl"] Apr 22 18:56:34.277286 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:56:34.277246 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd61a4cd2_d237_4f6a_b403_57164324e96c.slice/crio-1f29e0e4758d7d67ff6e5c51dd465aeae88a4a3e84ec12bf2bc527522959f4a7 WatchSource:0}: Error finding container 1f29e0e4758d7d67ff6e5c51dd465aeae88a4a3e84ec12bf2bc527522959f4a7: Status 404 returned error can't find the container with id 1f29e0e4758d7d67ff6e5c51dd465aeae88a4a3e84ec12bf2bc527522959f4a7 Apr 22 18:56:34.963384 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:34.963339 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" event={"ID":"d61a4cd2-d237-4f6a-b403-57164324e96c","Type":"ContainerStarted","Data":"1f29e0e4758d7d67ff6e5c51dd465aeae88a4a3e84ec12bf2bc527522959f4a7"} Apr 22 18:56:36.783911 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:36.783868 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 18:56:37.974832 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:37.974794 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" event={"ID":"d61a4cd2-d237-4f6a-b403-57164324e96c","Type":"ContainerStarted","Data":"77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4"} Apr 22 18:56:37.975238 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:37.974886 
2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:56:37.991498 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:37.991446 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" podStartSLOduration=1.938525483 podStartE2EDuration="4.991432881s" podCreationTimestamp="2026-04-22 18:56:33 +0000 UTC" firstStartedPulling="2026-04-22 18:56:34.279136266 +0000 UTC m=+581.803799795" lastFinishedPulling="2026-04-22 18:56:37.332043663 +0000 UTC m=+584.856707193" observedRunningTime="2026-04-22 18:56:37.989134273 +0000 UTC m=+585.513797824" watchObservedRunningTime="2026-04-22 18:56:37.991432881 +0000 UTC m=+585.516096431" Apr 22 18:56:43.985086 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:43.985044 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:56:46.784083 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:46.784053 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:56:47.431954 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.431920 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl"] Apr 22 18:56:47.432259 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.432218 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" podUID="d61a4cd2-d237-4f6a-b403-57164324e96c" containerName="switch-graph-9844b" containerID="cri-o://77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4" gracePeriod=30 Apr 22 18:56:47.617664 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.617631 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59"] Apr 22 18:56:47.618236 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.617982 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kserve-container" containerID="cri-o://f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723" gracePeriod=30 Apr 22 18:56:47.618236 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.618103 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kube-rbac-proxy" containerID="cri-o://81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a" gracePeriod=30 Apr 22 18:56:47.680262 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.680230 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v"] Apr 22 18:56:47.683729 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.683690 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:47.686114 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.686093 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-70f3d-kube-rbac-proxy-sar-config\"" Apr 22 18:56:47.686219 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.686094 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-70f3d-predictor-serving-cert\"" Apr 22 18:56:47.692594 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.692571 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v"] Apr 22 18:56:47.752455 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.752423 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-70f3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b913f2e-9481-4e91-9946-9d957f29ae31-error-404-isvc-70f3d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-70f3d-predictor-568f669b8f-klg8v\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") " pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:47.752591 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.752458 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn27j\" (UniqueName: \"kubernetes.io/projected/4b913f2e-9481-4e91-9946-9d957f29ae31-kube-api-access-xn27j\") pod \"error-404-isvc-70f3d-predictor-568f669b8f-klg8v\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") " pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:47.752591 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.752574 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/4b913f2e-9481-4e91-9946-9d957f29ae31-proxy-tls\") pod \"error-404-isvc-70f3d-predictor-568f669b8f-klg8v\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") " pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:47.853841 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.853807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-70f3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b913f2e-9481-4e91-9946-9d957f29ae31-error-404-isvc-70f3d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-70f3d-predictor-568f669b8f-klg8v\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") " pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:47.853841 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.853845 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn27j\" (UniqueName: \"kubernetes.io/projected/4b913f2e-9481-4e91-9946-9d957f29ae31-kube-api-access-xn27j\") pod \"error-404-isvc-70f3d-predictor-568f669b8f-klg8v\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") " pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:47.854389 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.853911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b913f2e-9481-4e91-9946-9d957f29ae31-proxy-tls\") pod \"error-404-isvc-70f3d-predictor-568f669b8f-klg8v\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") " pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:47.854639 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.854618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-70f3d-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/4b913f2e-9481-4e91-9946-9d957f29ae31-error-404-isvc-70f3d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-70f3d-predictor-568f669b8f-klg8v\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") " pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:47.856344 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.856323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b913f2e-9481-4e91-9946-9d957f29ae31-proxy-tls\") pod \"error-404-isvc-70f3d-predictor-568f669b8f-klg8v\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") " pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:47.863919 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.863895 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn27j\" (UniqueName: \"kubernetes.io/projected/4b913f2e-9481-4e91-9946-9d957f29ae31-kube-api-access-xn27j\") pod \"error-404-isvc-70f3d-predictor-568f669b8f-klg8v\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") " pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:47.994826 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:47.994787 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:48.010173 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:48.010143 2577 generic.go:358] "Generic (PLEG): container finished" podID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerID="81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a" exitCode=2 Apr 22 18:56:48.010333 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:48.010188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" event={"ID":"eba18b2a-b04d-41df-a35c-7aded0b0ca0d","Type":"ContainerDied","Data":"81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a"} Apr 22 18:56:48.120182 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:48.120156 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v"] Apr 22 18:56:48.122956 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:56:48.122933 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b913f2e_9481_4e91_9946_9d957f29ae31.slice/crio-b072fb2b130dc996bf7316ac2fb5f6f9d32af659e3df182d922ce92c3196b370 WatchSource:0}: Error finding container b072fb2b130dc996bf7316ac2fb5f6f9d32af659e3df182d922ce92c3196b370: Status 404 returned error can't find the container with id b072fb2b130dc996bf7316ac2fb5f6f9d32af659e3df182d922ce92c3196b370 Apr 22 18:56:48.982985 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:48.982941 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" podUID="d61a4cd2-d237-4f6a-b403-57164324e96c" containerName="switch-graph-9844b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:56:49.015291 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:49.015241 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" event={"ID":"4b913f2e-9481-4e91-9946-9d957f29ae31","Type":"ContainerStarted","Data":"92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848"} Apr 22 18:56:49.015291 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:49.015293 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" event={"ID":"4b913f2e-9481-4e91-9946-9d957f29ae31","Type":"ContainerStarted","Data":"a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01"} Apr 22 18:56:49.015502 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:49.015304 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" event={"ID":"4b913f2e-9481-4e91-9946-9d957f29ae31","Type":"ContainerStarted","Data":"b072fb2b130dc996bf7316ac2fb5f6f9d32af659e3df182d922ce92c3196b370"} Apr 22 18:56:49.015502 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:49.015376 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:49.031030 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:49.030975 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" podStartSLOduration=2.030958331 podStartE2EDuration="2.030958331s" podCreationTimestamp="2026-04-22 18:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:56:49.030161077 +0000 UTC m=+596.554824619" watchObservedRunningTime="2026-04-22 18:56:49.030958331 +0000 UTC m=+596.555621884" Apr 22 18:56:50.018553 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.018520 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:50.020050 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.020019 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 22 18:56:50.767897 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.767876 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:56:50.878147 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.878063 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv2g9\" (UniqueName: \"kubernetes.io/projected/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-kube-api-access-nv2g9\") pod \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " Apr 22 18:56:50.878147 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.878131 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-9844b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-error-404-isvc-9844b-kube-rbac-proxy-sar-config\") pod \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " Apr 22 18:56:50.878370 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.878198 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-proxy-tls\") pod \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\" (UID: \"eba18b2a-b04d-41df-a35c-7aded0b0ca0d\") " Apr 22 18:56:50.878529 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.878505 2577 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-error-404-isvc-9844b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-9844b-kube-rbac-proxy-sar-config") pod "eba18b2a-b04d-41df-a35c-7aded0b0ca0d" (UID: "eba18b2a-b04d-41df-a35c-7aded0b0ca0d"). InnerVolumeSpecName "error-404-isvc-9844b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:56:50.880435 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.880407 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-kube-api-access-nv2g9" (OuterVolumeSpecName: "kube-api-access-nv2g9") pod "eba18b2a-b04d-41df-a35c-7aded0b0ca0d" (UID: "eba18b2a-b04d-41df-a35c-7aded0b0ca0d"). InnerVolumeSpecName "kube-api-access-nv2g9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:56:50.880548 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.880407 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "eba18b2a-b04d-41df-a35c-7aded0b0ca0d" (UID: "eba18b2a-b04d-41df-a35c-7aded0b0ca0d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:56:50.979050 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.979011 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:56:50.979050 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.979043 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nv2g9\" (UniqueName: \"kubernetes.io/projected/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-kube-api-access-nv2g9\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:56:50.979050 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:50.979056 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-9844b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eba18b2a-b04d-41df-a35c-7aded0b0ca0d-error-404-isvc-9844b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:56:51.028948 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.028912 2577 generic.go:358] "Generic (PLEG): container finished" podID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerID="f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723" exitCode=0 Apr 22 18:56:51.029367 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.028995 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" event={"ID":"eba18b2a-b04d-41df-a35c-7aded0b0ca0d","Type":"ContainerDied","Data":"f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723"} Apr 22 18:56:51.029367 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.029014 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" Apr 22 18:56:51.029367 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.029032 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59" event={"ID":"eba18b2a-b04d-41df-a35c-7aded0b0ca0d","Type":"ContainerDied","Data":"83ddc88a516a6f844a736bee691454741407cb06fd4f1c55a9d3034677520ee7"} Apr 22 18:56:51.029367 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.029048 2577 scope.go:117] "RemoveContainer" containerID="81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a" Apr 22 18:56:51.029731 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.029693 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 22 18:56:51.037483 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.037251 2577 scope.go:117] "RemoveContainer" containerID="f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723" Apr 22 18:56:51.044810 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.044788 2577 scope.go:117] "RemoveContainer" containerID="81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a" Apr 22 18:56:51.045088 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:56:51.045068 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a\": container with ID starting with 81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a not found: ID does not exist" containerID="81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a" Apr 22 18:56:51.045167 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.045098 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a"} err="failed to get container status \"81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a\": rpc error: code = NotFound desc = could not find container \"81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a\": container with ID starting with 81b34cd29fb86dd11f2d2db68337babd37e87c48dbc92afbf67bd6db69f7aa4a not found: ID does not exist" Apr 22 18:56:51.045167 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.045116 2577 scope.go:117] "RemoveContainer" containerID="f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723" Apr 22 18:56:51.045400 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:56:51.045380 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723\": container with ID starting with f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723 not found: ID does not exist" containerID="f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723" Apr 22 18:56:51.045471 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.045409 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723"} err="failed to get container status \"f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723\": rpc error: code = NotFound desc = could not find container \"f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723\": container with ID starting with f563aed328a25d9d69735d786c5196bc5b71c1b9fc289ac0d79afce9e3e86723 not found: ID does not exist" Apr 22 18:56:51.045594 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.045574 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59"] Apr 22 18:56:51.049442 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:51.049422 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9844b-predictor-57bf65897f-5ns59"] Apr 22 18:56:52.890205 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:52.890171 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 18:56:52.890672 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:52.890604 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 18:56:52.992258 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:52.992224 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" path="/var/lib/kubelet/pods/eba18b2a-b04d-41df-a35c-7aded0b0ca0d/volumes" Apr 22 18:56:53.982139 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:53.982094 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" podUID="d61a4cd2-d237-4f6a-b403-57164324e96c" containerName="switch-graph-9844b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:56:56.035037 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:56.035012 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:56:56.035562 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:56.035535 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.35:8080: connect: connection refused" Apr 22 18:56:58.981941 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:58.981897 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" podUID="d61a4cd2-d237-4f6a-b403-57164324e96c" containerName="switch-graph-9844b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:56:58.982341 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:56:58.981994 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:57:03.982877 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:03.982830 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" podUID="d61a4cd2-d237-4f6a-b403-57164324e96c" containerName="switch-graph-9844b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:06.035531 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:06.035486 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 22 18:57:08.982179 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:08.982138 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" podUID="d61a4cd2-d237-4f6a-b403-57164324e96c" containerName="switch-graph-9844b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:13.248980 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.248940 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6"] Apr 22 18:57:13.249410 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.249329 2577 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kserve-container" Apr 22 18:57:13.249410 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.249341 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kserve-container" Apr 22 18:57:13.249410 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.249361 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kube-rbac-proxy" Apr 22 18:57:13.249410 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.249367 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kube-rbac-proxy" Apr 22 18:57:13.249559 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.249449 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kserve-container" Apr 22 18:57:13.249559 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.249463 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="eba18b2a-b04d-41df-a35c-7aded0b0ca0d" containerName="kube-rbac-proxy" Apr 22 18:57:13.252426 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.252409 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:13.254717 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.254694 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 22 18:57:13.254851 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.254836 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 22 18:57:13.258289 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.258251 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6"] Apr 22 18:57:13.374839 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.374805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-openshift-service-ca-bundle\") pod \"model-chainer-76487779f8-7cgm6\" (UID: \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:13.374839 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.374839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls\") pod \"model-chainer-76487779f8-7cgm6\" (UID: \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:13.475791 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.475752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-openshift-service-ca-bundle\") pod \"model-chainer-76487779f8-7cgm6\" (UID: \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\") " 
pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:13.475791 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.475792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls\") pod \"model-chainer-76487779f8-7cgm6\" (UID: \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:13.476000 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:13.475904 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found Apr 22 18:57:13.476000 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:13.475965 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls podName:d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c nodeName:}" failed. No retries permitted until 2026-04-22 18:57:13.975949794 +0000 UTC m=+621.500613323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls") pod "model-chainer-76487779f8-7cgm6" (UID: "d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c") : secret "model-chainer-serving-cert" not found Apr 22 18:57:13.476424 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.476403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-openshift-service-ca-bundle\") pod \"model-chainer-76487779f8-7cgm6\" (UID: \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:13.980201 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.980159 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls\") pod \"model-chainer-76487779f8-7cgm6\" (UID: \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:13.980419 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:13.980354 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found Apr 22 18:57:13.980469 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:13.980432 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls podName:d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c nodeName:}" failed. No retries permitted until 2026-04-22 18:57:14.980412288 +0000 UTC m=+622.505075820 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls") pod "model-chainer-76487779f8-7cgm6" (UID: "d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c") : secret "model-chainer-serving-cert" not found Apr 22 18:57:13.982203 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:13.982175 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" podUID="d61a4cd2-d237-4f6a-b403-57164324e96c" containerName="switch-graph-9844b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:14.988742 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:14.988700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls\") pod \"model-chainer-76487779f8-7cgm6\" (UID: \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:14.991204 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:14.991177 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls\") pod \"model-chainer-76487779f8-7cgm6\" (UID: \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:15.063882 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:15.063850 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:15.186993 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:15.186964 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6"] Apr 22 18:57:15.189937 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:15.189917 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:57:16.036389 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:16.036346 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 22 18:57:16.118233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:16.118203 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" event={"ID":"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c","Type":"ContainerStarted","Data":"a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540"} Apr 22 18:57:16.118233 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:16.118236 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" event={"ID":"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c","Type":"ContainerStarted","Data":"1396b8ffabe2a21d19ad5f84cb2ec028b3f5d67e1069235a390f8c9c47b5b645"} Apr 22 18:57:16.118493 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:16.118260 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:16.134473 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:16.134416 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" podStartSLOduration=3.134402072 
podStartE2EDuration="3.134402072s" podCreationTimestamp="2026-04-22 18:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:16.132324744 +0000 UTC m=+623.656988295" watchObservedRunningTime="2026-04-22 18:57:16.134402072 +0000 UTC m=+623.659065632" Apr 22 18:57:17.460064 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:17.460027 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd61a4cd2_d237_4f6a_b403_57164324e96c.slice/crio-1f29e0e4758d7d67ff6e5c51dd465aeae88a4a3e84ec12bf2bc527522959f4a7\": RecentStats: unable to find data in memory cache]" Apr 22 18:57:17.460452 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:17.460039 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd61a4cd2_d237_4f6a_b403_57164324e96c.slice/crio-conmon-77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4.scope\": RecentStats: unable to find data in memory cache]" Apr 22 18:57:17.460452 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:17.460244 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd61a4cd2_d237_4f6a_b403_57164324e96c.slice/crio-77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4.scope\": RecentStats: unable to find data in memory cache]" Apr 22 18:57:17.460452 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:17.460409 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd61a4cd2_d237_4f6a_b403_57164324e96c.slice/crio-conmon-77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd61a4cd2_d237_4f6a_b403_57164324e96c.slice/crio-77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4.scope\": RecentStats: unable to find data in memory cache]" Apr 22 18:57:17.595420 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:17.595387 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:57:17.712565 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:17.712535 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d61a4cd2-d237-4f6a-b403-57164324e96c-proxy-tls\") pod \"d61a4cd2-d237-4f6a-b403-57164324e96c\" (UID: \"d61a4cd2-d237-4f6a-b403-57164324e96c\") " Apr 22 18:57:17.712726 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:17.712606 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d61a4cd2-d237-4f6a-b403-57164324e96c-openshift-service-ca-bundle\") pod \"d61a4cd2-d237-4f6a-b403-57164324e96c\" (UID: \"d61a4cd2-d237-4f6a-b403-57164324e96c\") " Apr 22 18:57:17.712965 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:17.712928 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61a4cd2-d237-4f6a-b403-57164324e96c-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d61a4cd2-d237-4f6a-b403-57164324e96c" (UID: "d61a4cd2-d237-4f6a-b403-57164324e96c"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:57:17.714602 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:17.714580 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61a4cd2-d237-4f6a-b403-57164324e96c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d61a4cd2-d237-4f6a-b403-57164324e96c" (UID: "d61a4cd2-d237-4f6a-b403-57164324e96c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:57:17.813529 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:17.813494 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d61a4cd2-d237-4f6a-b403-57164324e96c-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:57:17.813529 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:17.813523 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d61a4cd2-d237-4f6a-b403-57164324e96c-openshift-service-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:57:18.125856 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:18.125759 2577 generic.go:358] "Generic (PLEG): container finished" podID="d61a4cd2-d237-4f6a-b403-57164324e96c" containerID="77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4" exitCode=0 Apr 22 18:57:18.125856 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:18.125824 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" Apr 22 18:57:18.125856 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:18.125843 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" event={"ID":"d61a4cd2-d237-4f6a-b403-57164324e96c","Type":"ContainerDied","Data":"77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4"} Apr 22 18:57:18.126102 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:18.125882 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl" event={"ID":"d61a4cd2-d237-4f6a-b403-57164324e96c","Type":"ContainerDied","Data":"1f29e0e4758d7d67ff6e5c51dd465aeae88a4a3e84ec12bf2bc527522959f4a7"} Apr 22 18:57:18.126102 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:18.125898 2577 scope.go:117] "RemoveContainer" containerID="77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4" Apr 22 18:57:18.135128 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:18.135106 2577 scope.go:117] "RemoveContainer" containerID="77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4" Apr 22 18:57:18.135393 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:18.135375 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4\": container with ID starting with 77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4 not found: ID does not exist" containerID="77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4" Apr 22 18:57:18.135485 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:18.135400 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4"} err="failed to get container status 
\"77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4\": rpc error: code = NotFound desc = could not find container \"77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4\": container with ID starting with 77f49379933527053fdf1d387d16258d0e8b078d0359b8efe1bf6acac99f71c4 not found: ID does not exist" Apr 22 18:57:18.146875 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:18.146849 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl"] Apr 22 18:57:18.152588 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:18.152563 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9844b-8475b5564f-rnmkl"] Apr 22 18:57:18.991042 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:18.991009 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61a4cd2-d237-4f6a-b403-57164324e96c" path="/var/lib/kubelet/pods/d61a4cd2-d237-4f6a-b403-57164324e96c/volumes" Apr 22 18:57:22.129118 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:22.129093 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:23.342244 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.342207 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6"] Apr 22 18:57:23.342670 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.342535 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" podUID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" containerName="model-chainer" containerID="cri-o://a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540" gracePeriod=30 Apr 22 18:57:23.501604 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.501570 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558"] Apr 
22 18:57:23.502308 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.502251 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" containerID="cri-o://2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee" gracePeriod=30 Apr 22 18:57:23.502449 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.502347 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kube-rbac-proxy" containerID="cri-o://894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947" gracePeriod=30 Apr 22 18:57:23.585883 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.585850 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj"] Apr 22 18:57:23.586223 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.586211 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d61a4cd2-d237-4f6a-b403-57164324e96c" containerName="switch-graph-9844b" Apr 22 18:57:23.586287 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.586224 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61a4cd2-d237-4f6a-b403-57164324e96c" containerName="switch-graph-9844b" Apr 22 18:57:23.586336 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.586313 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d61a4cd2-d237-4f6a-b403-57164324e96c" containerName="switch-graph-9844b" Apr 22 18:57:23.590770 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.590750 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:23.593113 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.593054 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-0e291-predictor-serving-cert\"" Apr 22 18:57:23.593216 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.593124 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-0e291-kube-rbac-proxy-sar-config\"" Apr 22 18:57:23.600054 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.600024 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj"] Apr 22 18:57:23.769884 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.769849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkg8n\" (UniqueName: \"kubernetes.io/projected/d9c07e2c-f953-4b9e-8241-dbf9834c901f-kube-api-access-hkg8n\") pod \"error-404-isvc-0e291-predictor-748c9cfd47-j5dmj\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:23.770053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.769914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-0e291-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9c07e2c-f953-4b9e-8241-dbf9834c901f-error-404-isvc-0e291-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0e291-predictor-748c9cfd47-j5dmj\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:23.770053 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.769953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/d9c07e2c-f953-4b9e-8241-dbf9834c901f-proxy-tls\") pod \"error-404-isvc-0e291-predictor-748c9cfd47-j5dmj\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:23.871433 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.871358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9c07e2c-f953-4b9e-8241-dbf9834c901f-proxy-tls\") pod \"error-404-isvc-0e291-predictor-748c9cfd47-j5dmj\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:23.871593 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.871454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkg8n\" (UniqueName: \"kubernetes.io/projected/d9c07e2c-f953-4b9e-8241-dbf9834c901f-kube-api-access-hkg8n\") pod \"error-404-isvc-0e291-predictor-748c9cfd47-j5dmj\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:23.871593 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:23.871505 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-0e291-predictor-serving-cert: secret "error-404-isvc-0e291-predictor-serving-cert" not found Apr 22 18:57:23.871593 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.871509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-0e291-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9c07e2c-f953-4b9e-8241-dbf9834c901f-error-404-isvc-0e291-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0e291-predictor-748c9cfd47-j5dmj\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:23.871593 ip-10-0-143-56 
kubenswrapper[2577]: E0422 18:57:23.871576 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9c07e2c-f953-4b9e-8241-dbf9834c901f-proxy-tls podName:d9c07e2c-f953-4b9e-8241-dbf9834c901f nodeName:}" failed. No retries permitted until 2026-04-22 18:57:24.371559469 +0000 UTC m=+631.896222997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d9c07e2c-f953-4b9e-8241-dbf9834c901f-proxy-tls") pod "error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" (UID: "d9c07e2c-f953-4b9e-8241-dbf9834c901f") : secret "error-404-isvc-0e291-predictor-serving-cert" not found Apr 22 18:57:23.872247 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.872228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-0e291-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9c07e2c-f953-4b9e-8241-dbf9834c901f-error-404-isvc-0e291-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0e291-predictor-748c9cfd47-j5dmj\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:23.880292 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:23.880250 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkg8n\" (UniqueName: \"kubernetes.io/projected/d9c07e2c-f953-4b9e-8241-dbf9834c901f-kube-api-access-hkg8n\") pod \"error-404-isvc-0e291-predictor-748c9cfd47-j5dmj\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:24.148218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:24.148133 2577 generic.go:358] "Generic (PLEG): container finished" podID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerID="894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947" exitCode=2 Apr 22 18:57:24.148218 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:24.148191 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" event={"ID":"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53","Type":"ContainerDied","Data":"894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947"} Apr 22 18:57:24.376310 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:24.376245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9c07e2c-f953-4b9e-8241-dbf9834c901f-proxy-tls\") pod \"error-404-isvc-0e291-predictor-748c9cfd47-j5dmj\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:24.378687 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:24.378668 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9c07e2c-f953-4b9e-8241-dbf9834c901f-proxy-tls\") pod \"error-404-isvc-0e291-predictor-748c9cfd47-j5dmj\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:24.502881 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:24.502844 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:24.626973 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:24.626951 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj"] Apr 22 18:57:24.628971 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:57:24.628935 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c07e2c_f953_4b9e_8241_dbf9834c901f.slice/crio-4b4005964b7bed77326c5331fefd0261c81cebfc80b60439f6c5dbd05adebf5a WatchSource:0}: Error finding container 4b4005964b7bed77326c5331fefd0261c81cebfc80b60439f6c5dbd05adebf5a: Status 404 returned error can't find the container with id 4b4005964b7bed77326c5331fefd0261c81cebfc80b60439f6c5dbd05adebf5a Apr 22 18:57:25.153432 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:25.153342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" event={"ID":"d9c07e2c-f953-4b9e-8241-dbf9834c901f","Type":"ContainerStarted","Data":"0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f"} Apr 22 18:57:25.153432 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:25.153381 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" event={"ID":"d9c07e2c-f953-4b9e-8241-dbf9834c901f","Type":"ContainerStarted","Data":"72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3"} Apr 22 18:57:25.153432 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:25.153394 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" event={"ID":"d9c07e2c-f953-4b9e-8241-dbf9834c901f","Type":"ContainerStarted","Data":"4b4005964b7bed77326c5331fefd0261c81cebfc80b60439f6c5dbd05adebf5a"} Apr 22 18:57:25.153694 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:57:25.153466 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:25.171402 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:25.171357 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" podStartSLOduration=2.171342937 podStartE2EDuration="2.171342937s" podCreationTimestamp="2026-04-22 18:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:25.16887046 +0000 UTC m=+632.693534011" watchObservedRunningTime="2026-04-22 18:57:25.171342937 +0000 UTC m=+632.696006487" Apr 22 18:57:26.035772 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:26.035735 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 22 18:57:26.159604 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:26.159562 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:26.160526 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:26.160501 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 18:57:26.778888 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:26.778844 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused" Apr 22 18:57:26.783417 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:26.783385 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 18:57:27.126037 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:27.125947 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" podUID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:27.163426 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:27.163390 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 18:57:28.352418 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.352392 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:57:28.512199 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.512164 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kserve-provision-location\") pod \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " Apr 22 18:57:28.512409 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.512300 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-proxy-tls\") pod \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " Apr 22 18:57:28.512409 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.512342 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " Apr 22 18:57:28.512409 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.512359 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcmf5\" (UniqueName: \"kubernetes.io/projected/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kube-api-access-pcmf5\") pod \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\" (UID: \"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53\") " Apr 22 18:57:28.512590 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.512558 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" (UID: "f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:57:28.512729 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.512694 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" (UID: "f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:57:28.514402 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.514379 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kube-api-access-pcmf5" (OuterVolumeSpecName: "kube-api-access-pcmf5") pod "f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" (UID: "f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53"). InnerVolumeSpecName "kube-api-access-pcmf5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:57:28.514486 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.514424 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" (UID: "f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:57:28.613577 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.613539 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:57:28.613577 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.613575 2577 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:57:28.613787 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.613589 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pcmf5\" (UniqueName: \"kubernetes.io/projected/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kube-api-access-pcmf5\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:57:28.613787 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:28.613604 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53-kserve-provision-location\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 18:57:29.170889 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.170856 2577 generic.go:358] "Generic (PLEG): container finished" podID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerID="2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee" exitCode=0 Apr 22 18:57:29.171046 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.170946 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" Apr 22 18:57:29.171046 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.170945 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" event={"ID":"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53","Type":"ContainerDied","Data":"2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee"} Apr 22 18:57:29.171046 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.170991 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558" event={"ID":"f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53","Type":"ContainerDied","Data":"328cc9104b201f0ae9280f38e849a82e93f13678140bdfac475d2378663e3091"} Apr 22 18:57:29.171046 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.171011 2577 scope.go:117] "RemoveContainer" containerID="894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947" Apr 22 18:57:29.178938 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.178914 2577 scope.go:117] "RemoveContainer" containerID="2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee" Apr 22 18:57:29.185995 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.185973 2577 scope.go:117] "RemoveContainer" containerID="d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713" Apr 22 18:57:29.189143 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.189122 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558"] Apr 22 18:57:29.191909 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.191889 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-8q558"] Apr 22 18:57:29.195441 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.195411 2577 scope.go:117] "RemoveContainer" 
containerID="894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947" Apr 22 18:57:29.195745 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:29.195721 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947\": container with ID starting with 894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947 not found: ID does not exist" containerID="894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947" Apr 22 18:57:29.195830 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.195752 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947"} err="failed to get container status \"894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947\": rpc error: code = NotFound desc = could not find container \"894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947\": container with ID starting with 894c7382f211270a32cb87c9d5ec8cb31daf513a76fa644eeb0d7454c0f56947 not found: ID does not exist" Apr 22 18:57:29.195830 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.195781 2577 scope.go:117] "RemoveContainer" containerID="2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee" Apr 22 18:57:29.196063 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:29.196046 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee\": container with ID starting with 2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee not found: ID does not exist" containerID="2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee" Apr 22 18:57:29.196103 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.196070 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee"} err="failed to get container status \"2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee\": rpc error: code = NotFound desc = could not find container \"2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee\": container with ID starting with 2f4b8d192166490588dc5f8fd83df1f44f27bb06ccf48977bba33949ecd556ee not found: ID does not exist" Apr 22 18:57:29.196103 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.196086 2577 scope.go:117] "RemoveContainer" containerID="d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713" Apr 22 18:57:29.196370 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:29.196354 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713\": container with ID starting with d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713 not found: ID does not exist" containerID="d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713" Apr 22 18:57:29.196432 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:29.196372 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713"} err="failed to get container status \"d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713\": rpc error: code = NotFound desc = could not find container \"d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713\": container with ID starting with d82cb3f73647c41c125d361e4330e097fff844eee2abc3f999e1c1e7d415f713 not found: ID does not exist" Apr 22 18:57:30.991916 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:30.991879 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" 
path="/var/lib/kubelet/pods/f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53/volumes" Apr 22 18:57:32.126188 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:32.126143 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" podUID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:32.167862 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:32.167836 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 18:57:32.168424 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:32.168392 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 18:57:36.036405 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:36.036375 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" Apr 22 18:57:37.126027 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:37.125979 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" podUID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:37.126432 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:37.126084 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" Apr 22 18:57:42.125744 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:42.125694 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" 
podUID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:42.168703 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:42.168667 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 18:57:47.126313 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.126249 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" podUID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:47.648723 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.648690 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj"] Apr 22 18:57:47.649086 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.649074 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" Apr 22 18:57:47.649086 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.649087 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" Apr 22 18:57:47.649181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.649098 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="storage-initializer" Apr 22 18:57:47.649181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.649104 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="storage-initializer" Apr 22 18:57:47.649181 ip-10-0-143-56 
kubenswrapper[2577]: I0422 18:57:47.649112 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kube-rbac-proxy" Apr 22 18:57:47.649181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.649117 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kube-rbac-proxy" Apr 22 18:57:47.649181 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.649181 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kube-rbac-proxy" Apr 22 18:57:47.649393 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.649189 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9ca8ccb-2c94-4c3f-9d22-4cfaed4bde53" containerName="kserve-container" Apr 22 18:57:47.652210 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.652190 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" Apr 22 18:57:47.654550 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.654531 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-70f3d-kube-rbac-proxy-sar-config\"" Apr 22 18:57:47.654615 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.654531 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-70f3d-serving-cert\"" Apr 22 18:57:47.662061 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.662037 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj"] Apr 22 18:57:47.782675 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.782641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d659fae6-80e2-4e52-8641-98e62d0721f2-openshift-service-ca-bundle\") pod \"switch-graph-70f3d-6496fb9568-5tskj\" (UID: \"d659fae6-80e2-4e52-8641-98e62d0721f2\") " pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" Apr 22 18:57:47.782675 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.782687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d659fae6-80e2-4e52-8641-98e62d0721f2-proxy-tls\") pod \"switch-graph-70f3d-6496fb9568-5tskj\" (UID: \"d659fae6-80e2-4e52-8641-98e62d0721f2\") " pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" Apr 22 18:57:47.883875 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.883841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d659fae6-80e2-4e52-8641-98e62d0721f2-openshift-service-ca-bundle\") pod \"switch-graph-70f3d-6496fb9568-5tskj\" (UID: \"d659fae6-80e2-4e52-8641-98e62d0721f2\") " pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" Apr 22 18:57:47.883875 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.883879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d659fae6-80e2-4e52-8641-98e62d0721f2-proxy-tls\") pod \"switch-graph-70f3d-6496fb9568-5tskj\" (UID: \"d659fae6-80e2-4e52-8641-98e62d0721f2\") " pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" Apr 22 18:57:47.884504 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.884472 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d659fae6-80e2-4e52-8641-98e62d0721f2-openshift-service-ca-bundle\") pod \"switch-graph-70f3d-6496fb9568-5tskj\" (UID: \"d659fae6-80e2-4e52-8641-98e62d0721f2\") " 
pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" Apr 22 18:57:47.886243 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.886223 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d659fae6-80e2-4e52-8641-98e62d0721f2-proxy-tls\") pod \"switch-graph-70f3d-6496fb9568-5tskj\" (UID: \"d659fae6-80e2-4e52-8641-98e62d0721f2\") " pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" Apr 22 18:57:47.963327 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:47.963296 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" Apr 22 18:57:48.087461 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:48.087428 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj"] Apr 22 18:57:48.089553 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:57:48.089525 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd659fae6_80e2_4e52_8641_98e62d0721f2.slice/crio-36e4550e74c89c3807d76cfbadc152268e46ff2c2568572f1ad4935dd069214b WatchSource:0}: Error finding container 36e4550e74c89c3807d76cfbadc152268e46ff2c2568572f1ad4935dd069214b: Status 404 returned error can't find the container with id 36e4550e74c89c3807d76cfbadc152268e46ff2c2568572f1ad4935dd069214b Apr 22 18:57:48.236990 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:48.236901 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" event={"ID":"d659fae6-80e2-4e52-8641-98e62d0721f2","Type":"ContainerStarted","Data":"7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869"} Apr 22 18:57:48.236990 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:48.236944 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" 
event={"ID":"d659fae6-80e2-4e52-8641-98e62d0721f2","Type":"ContainerStarted","Data":"36e4550e74c89c3807d76cfbadc152268e46ff2c2568572f1ad4935dd069214b"}
Apr 22 18:57:48.237443 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:48.237025 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj"
Apr 22 18:57:48.253185 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:48.253139 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" podStartSLOduration=1.253126842 podStartE2EDuration="1.253126842s" podCreationTimestamp="2026-04-22 18:57:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:48.251135036 +0000 UTC m=+655.775798628" watchObservedRunningTime="2026-04-22 18:57:48.253126842 +0000 UTC m=+655.777790393"
Apr 22 18:57:52.126475 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:52.126436 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" podUID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:57:52.169046 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:52.169007 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 22 18:57:53.509691 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:53.509658 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6"
Apr 22 18:57:53.637609 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:53.637534 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-openshift-service-ca-bundle\") pod \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\" (UID: \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\") "
Apr 22 18:57:53.637609 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:53.637605 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls\") pod \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\" (UID: \"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c\") "
Apr 22 18:57:53.637947 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:53.637922 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" (UID: "d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:57:53.639643 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:53.639606 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" (UID: "d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:57:53.738211 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:53.738180 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-openshift-service-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:57:53.738211 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:53.738209 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 18:57:54.246460 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:54.246427 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj"
Apr 22 18:57:54.258836 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:54.258808 2577 generic.go:358] "Generic (PLEG): container finished" podID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" containerID="a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540" exitCode=0
Apr 22 18:57:54.258979 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:54.258898 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6"
Apr 22 18:57:54.258979 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:54.258896 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" event={"ID":"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c","Type":"ContainerDied","Data":"a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540"}
Apr 22 18:57:54.258979 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:54.258942 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6" event={"ID":"d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c","Type":"ContainerDied","Data":"1396b8ffabe2a21d19ad5f84cb2ec028b3f5d67e1069235a390f8c9c47b5b645"}
Apr 22 18:57:54.258979 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:54.258963 2577 scope.go:117] "RemoveContainer" containerID="a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540"
Apr 22 18:57:54.269611 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:54.269589 2577 scope.go:117] "RemoveContainer" containerID="a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540"
Apr 22 18:57:54.270181 ip-10-0-143-56 kubenswrapper[2577]: E0422 18:57:54.270157 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540\": container with ID starting with a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540 not found: ID does not exist" containerID="a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540"
Apr 22 18:57:54.270357 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:54.270331 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540"} err="failed to get container status \"a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540\": rpc error: code = NotFound desc = could not find container \"a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540\": container with ID starting with a4e93b19ddadfde2140d81ea981463293d32999eedf0e5daddf3587622022540 not found: ID does not exist"
Apr 22 18:57:54.283120 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:54.283098 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6"]
Apr 22 18:57:54.285845 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:54.285822 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-7cgm6"]
Apr 22 18:57:54.991828 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:57:54.991788 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" path="/var/lib/kubelet/pods/d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c/volumes"
Apr 22 18:58:02.168694 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:02.168653 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 22 18:58:12.169859 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:12.169830 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj"
Apr 22 18:58:23.508509 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.508473 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"]
Apr 22 18:58:23.509012 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.508850 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" containerName="model-chainer"
Apr 22 18:58:23.509012 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.508861 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" containerName="model-chainer"
Apr 22 18:58:23.509012 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.508917 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9d05e4a-c6f2-48c8-a88e-2816e26a3a9c" containerName="model-chainer"
Apr 22 18:58:23.511782 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.511765 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"
Apr 22 18:58:23.514216 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.514193 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-0e291-kube-rbac-proxy-sar-config\""
Apr 22 18:58:23.514334 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.514235 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-0e291-serving-cert\""
Apr 22 18:58:23.520226 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.520202 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"]
Apr 22 18:58:23.616973 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.616933 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b81c6a1-d055-4e93-91a8-786d4c7267e3-openshift-service-ca-bundle\") pod \"sequence-graph-0e291-767bbd5cdd-qwg5j\" (UID: \"2b81c6a1-d055-4e93-91a8-786d4c7267e3\") " pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"
Apr 22 18:58:23.617137 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.616993 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b81c6a1-d055-4e93-91a8-786d4c7267e3-proxy-tls\") pod \"sequence-graph-0e291-767bbd5cdd-qwg5j\" (UID: \"2b81c6a1-d055-4e93-91a8-786d4c7267e3\") " pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"
Apr 22 18:58:23.717776 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.717741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b81c6a1-d055-4e93-91a8-786d4c7267e3-proxy-tls\") pod \"sequence-graph-0e291-767bbd5cdd-qwg5j\" (UID: \"2b81c6a1-d055-4e93-91a8-786d4c7267e3\") " pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"
Apr 22 18:58:23.717967 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.717828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b81c6a1-d055-4e93-91a8-786d4c7267e3-openshift-service-ca-bundle\") pod \"sequence-graph-0e291-767bbd5cdd-qwg5j\" (UID: \"2b81c6a1-d055-4e93-91a8-786d4c7267e3\") " pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"
Apr 22 18:58:23.718695 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.718669 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b81c6a1-d055-4e93-91a8-786d4c7267e3-openshift-service-ca-bundle\") pod \"sequence-graph-0e291-767bbd5cdd-qwg5j\" (UID: \"2b81c6a1-d055-4e93-91a8-786d4c7267e3\") " pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"
Apr 22 18:58:23.720060 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.720029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b81c6a1-d055-4e93-91a8-786d4c7267e3-proxy-tls\") pod \"sequence-graph-0e291-767bbd5cdd-qwg5j\" (UID: \"2b81c6a1-d055-4e93-91a8-786d4c7267e3\") " pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"
Apr 22 18:58:23.823126 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.823044 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"
Apr 22 18:58:23.944571 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:23.944547 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"]
Apr 22 18:58:23.946736 ip-10-0-143-56 kubenswrapper[2577]: W0422 18:58:23.946710 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b81c6a1_d055_4e93_91a8_786d4c7267e3.slice/crio-60b9b4e89e83118ba27940ad52bd11c7150c65ef65fe07cc0bdadfe9d54b8c6c WatchSource:0}: Error finding container 60b9b4e89e83118ba27940ad52bd11c7150c65ef65fe07cc0bdadfe9d54b8c6c: Status 404 returned error can't find the container with id 60b9b4e89e83118ba27940ad52bd11c7150c65ef65fe07cc0bdadfe9d54b8c6c
Apr 22 18:58:24.372724 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:24.372688 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" event={"ID":"2b81c6a1-d055-4e93-91a8-786d4c7267e3","Type":"ContainerStarted","Data":"253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952"}
Apr 22 18:58:24.372724 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:24.372729 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" event={"ID":"2b81c6a1-d055-4e93-91a8-786d4c7267e3","Type":"ContainerStarted","Data":"60b9b4e89e83118ba27940ad52bd11c7150c65ef65fe07cc0bdadfe9d54b8c6c"}
Apr 22 18:58:24.372942 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:24.372859 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"
Apr 22 18:58:24.389417 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:24.389379 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" podStartSLOduration=1.38936778 podStartE2EDuration="1.38936778s" podCreationTimestamp="2026-04-22 18:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:58:24.387750398 +0000 UTC m=+691.912413950" watchObservedRunningTime="2026-04-22 18:58:24.38936778 +0000 UTC m=+691.914031330"
Apr 22 18:58:30.382173 ip-10-0-143-56 kubenswrapper[2577]: I0422 18:58:30.382142 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"
Apr 22 19:01:52.914984 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:01:52.914955 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log"
Apr 22 19:01:52.917242 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:01:52.917219 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log"
Apr 22 19:06:02.369043 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.369005 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj"]
Apr 22 19:06:02.371663 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.369339 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" podUID="d659fae6-80e2-4e52-8641-98e62d0721f2" containerName="switch-graph-70f3d" containerID="cri-o://7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869" gracePeriod=30
Apr 22 19:06:02.554985 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.554949 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v"]
Apr 22 19:06:02.555415 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.555380 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kserve-container" containerID="cri-o://a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01" gracePeriod=30
Apr 22 19:06:02.555589 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.555406 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kube-rbac-proxy" containerID="cri-o://92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848" gracePeriod=30
Apr 22 19:06:02.601843 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.601817 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"]
Apr 22 19:06:02.605498 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.605480 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:02.607804 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.607785 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-50a5e-predictor-serving-cert\""
Apr 22 19:06:02.607888 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.607831 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-50a5e-kube-rbac-proxy-sar-config\""
Apr 22 19:06:02.612452 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.612431 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"]
Apr 22 19:06:02.679709 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.679682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2j7p\" (UniqueName: \"kubernetes.io/projected/a321ae25-0e38-4211-b254-595a1170bd90-kube-api-access-t2j7p\") pod \"error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:02.679842 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.679789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a321ae25-0e38-4211-b254-595a1170bd90-proxy-tls\") pod \"error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:02.679899 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.679853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-50a5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a321ae25-0e38-4211-b254-595a1170bd90-error-404-isvc-50a5e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:02.781125 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.781089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a321ae25-0e38-4211-b254-595a1170bd90-proxy-tls\") pod \"error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:02.781336 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.781161 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-50a5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a321ae25-0e38-4211-b254-595a1170bd90-error-404-isvc-50a5e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:02.781336 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.781202 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2j7p\" (UniqueName: \"kubernetes.io/projected/a321ae25-0e38-4211-b254-595a1170bd90-kube-api-access-t2j7p\") pod \"error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:02.781336 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:06:02.781237 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-serving-cert: secret "error-404-isvc-50a5e-predictor-serving-cert" not found
Apr 22 19:06:02.781336 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:06:02.781331 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a321ae25-0e38-4211-b254-595a1170bd90-proxy-tls podName:a321ae25-0e38-4211-b254-595a1170bd90 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:03.281310544 +0000 UTC m=+1150.805974081 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a321ae25-0e38-4211-b254-595a1170bd90-proxy-tls") pod "error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" (UID: "a321ae25-0e38-4211-b254-595a1170bd90") : secret "error-404-isvc-50a5e-predictor-serving-cert" not found
Apr 22 19:06:02.781822 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.781803 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-50a5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a321ae25-0e38-4211-b254-595a1170bd90-error-404-isvc-50a5e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:02.790699 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.790675 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2j7p\" (UniqueName: \"kubernetes.io/projected/a321ae25-0e38-4211-b254-595a1170bd90-kube-api-access-t2j7p\") pod \"error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:02.934037 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.933951 2577 generic.go:358] "Generic (PLEG): container finished" podID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerID="92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848" exitCode=2
Apr 22 19:06:02.934037 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:02.934019 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" event={"ID":"4b913f2e-9481-4e91-9946-9d957f29ae31","Type":"ContainerDied","Data":"92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848"}
Apr 22 19:06:03.286856 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:03.286822 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a321ae25-0e38-4211-b254-595a1170bd90-proxy-tls\") pod \"error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:03.289128 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:03.289105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a321ae25-0e38-4211-b254-595a1170bd90-proxy-tls\") pod \"error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:03.518047 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:03.518013 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:03.644488 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:03.644464 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"]
Apr 22 19:06:03.646133 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:06:03.646084 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda321ae25_0e38_4211_b254_595a1170bd90.slice/crio-388096663fa3e6fc56139ff465a1de5c8556abad20a8951b29af658430b0ef61 WatchSource:0}: Error finding container 388096663fa3e6fc56139ff465a1de5c8556abad20a8951b29af658430b0ef61: Status 404 returned error can't find the container with id 388096663fa3e6fc56139ff465a1de5c8556abad20a8951b29af658430b0ef61
Apr 22 19:06:03.647991 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:03.647970 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:06:03.939908 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:03.939820 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" event={"ID":"a321ae25-0e38-4211-b254-595a1170bd90","Type":"ContainerStarted","Data":"1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea"}
Apr 22 19:06:03.939908 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:03.939856 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" event={"ID":"a321ae25-0e38-4211-b254-595a1170bd90","Type":"ContainerStarted","Data":"79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5"}
Apr 22 19:06:03.939908 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:03.939867 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" event={"ID":"a321ae25-0e38-4211-b254-595a1170bd90","Type":"ContainerStarted","Data":"388096663fa3e6fc56139ff465a1de5c8556abad20a8951b29af658430b0ef61"}
Apr 22 19:06:03.940195 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:03.939954 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:03.958969 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:03.958921 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" podStartSLOduration=1.958905563 podStartE2EDuration="1.958905563s" podCreationTimestamp="2026-04-22 19:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:03.956763269 +0000 UTC m=+1151.481426819" watchObservedRunningTime="2026-04-22 19:06:03.958905563 +0000 UTC m=+1151.483569114"
Apr 22 19:06:04.244484 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:04.244447 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" podUID="d659fae6-80e2-4e52-8641-98e62d0721f2" containerName="switch-graph-70f3d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:06:04.942774 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:04.942740 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:04.944258 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:04.944227 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 19:06:05.811441 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.811413 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v"
Apr 22 19:06:05.909585 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.909495 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b913f2e-9481-4e91-9946-9d957f29ae31-proxy-tls\") pod \"4b913f2e-9481-4e91-9946-9d957f29ae31\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") "
Apr 22 19:06:05.909585 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.909565 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-70f3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b913f2e-9481-4e91-9946-9d957f29ae31-error-404-isvc-70f3d-kube-rbac-proxy-sar-config\") pod \"4b913f2e-9481-4e91-9946-9d957f29ae31\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") "
Apr 22 19:06:05.909830 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.909594 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn27j\" (UniqueName: \"kubernetes.io/projected/4b913f2e-9481-4e91-9946-9d957f29ae31-kube-api-access-xn27j\") pod \"4b913f2e-9481-4e91-9946-9d957f29ae31\" (UID: \"4b913f2e-9481-4e91-9946-9d957f29ae31\") "
Apr 22 19:06:05.909936 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.909911 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b913f2e-9481-4e91-9946-9d957f29ae31-error-404-isvc-70f3d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-70f3d-kube-rbac-proxy-sar-config") pod "4b913f2e-9481-4e91-9946-9d957f29ae31" (UID: "4b913f2e-9481-4e91-9946-9d957f29ae31"). InnerVolumeSpecName "error-404-isvc-70f3d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:06:05.911617 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.911585 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b913f2e-9481-4e91-9946-9d957f29ae31-kube-api-access-xn27j" (OuterVolumeSpecName: "kube-api-access-xn27j") pod "4b913f2e-9481-4e91-9946-9d957f29ae31" (UID: "4b913f2e-9481-4e91-9946-9d957f29ae31"). InnerVolumeSpecName "kube-api-access-xn27j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:06:05.911720 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.911616 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b913f2e-9481-4e91-9946-9d957f29ae31-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4b913f2e-9481-4e91-9946-9d957f29ae31" (UID: "4b913f2e-9481-4e91-9946-9d957f29ae31"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:06:05.947360 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.947331 2577 generic.go:358] "Generic (PLEG): container finished" podID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerID="a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01" exitCode=0
Apr 22 19:06:05.947754 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.947418 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v"
Apr 22 19:06:05.947754 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.947422 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" event={"ID":"4b913f2e-9481-4e91-9946-9d957f29ae31","Type":"ContainerDied","Data":"a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01"}
Apr 22 19:06:05.947754 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.947461 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v" event={"ID":"4b913f2e-9481-4e91-9946-9d957f29ae31","Type":"ContainerDied","Data":"b072fb2b130dc996bf7316ac2fb5f6f9d32af659e3df182d922ce92c3196b370"}
Apr 22 19:06:05.947754 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.947477 2577 scope.go:117] "RemoveContainer" containerID="92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848"
Apr 22 19:06:05.948156 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.948121 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 19:06:05.955750 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.955734 2577 scope.go:117] "RemoveContainer" containerID="a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01"
Apr 22 19:06:05.962803 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.962786 2577 scope.go:117] "RemoveContainer" containerID="92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848"
Apr 22 19:06:05.963060 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:06:05.963044 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848\": container with ID starting with 92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848 not found: ID does not exist" containerID="92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848"
Apr 22 19:06:05.963104 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.963067 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848"} err="failed to get container status \"92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848\": rpc error: code = NotFound desc = could not find container \"92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848\": container with ID starting with 92f99d9aafa4d61c2bb08e69130699503fbe71a1a5a436a6ebcd6071b0711848 not found: ID does not exist"
Apr 22 19:06:05.963104 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.963083 2577 scope.go:117] "RemoveContainer" containerID="a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01"
Apr 22 19:06:05.963335 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:06:05.963314 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01\": container with ID starting with a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01 not found: ID does not exist" containerID="a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01"
Apr 22 19:06:05.963391 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.963343 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01"} err="failed to get container status \"a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01\": rpc error: code = NotFound desc = could not find container \"a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01\": container with ID starting with a6cb2d7d46e50ef420549a617e0bc92c292c64bc2cf0975cad2bf708838e5f01 not found: ID does not exist"
Apr 22 19:06:05.967774 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.967752 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v"]
Apr 22 19:06:05.971296 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:05.971262 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-70f3d-predictor-568f669b8f-klg8v"]
Apr 22 19:06:06.010877 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:06.010847 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b913f2e-9481-4e91-9946-9d957f29ae31-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 19:06:06.010877 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:06.010878 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-70f3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b913f2e-9481-4e91-9946-9d957f29ae31-error-404-isvc-70f3d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 19:06:06.011081 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:06.010892 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xn27j\" (UniqueName: \"kubernetes.io/projected/4b913f2e-9481-4e91-9946-9d957f29ae31-kube-api-access-xn27j\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 19:06:06.991150 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:06.991114 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" path="/var/lib/kubelet/pods/4b913f2e-9481-4e91-9946-9d957f29ae31/volumes"
Apr 22 19:06:09.244857 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:09.244817 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" podUID="d659fae6-80e2-4e52-8641-98e62d0721f2" containerName="switch-graph-70f3d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:06:10.951717 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:10.951686 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
Apr 22 19:06:10.952146 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:10.952090 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 19:06:14.245494 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:14.245454 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" podUID="d659fae6-80e2-4e52-8641-98e62d0721f2" containerName="switch-graph-70f3d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:06:14.245900 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:14.245578 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj"
Apr 22 19:06:19.245120 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:19.245067 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" podUID="d659fae6-80e2-4e52-8641-98e62d0721f2" containerName="switch-graph-70f3d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:06:20.952872 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:20.952827 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"
podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 19:06:24.245492 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:24.245451 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" podUID="d659fae6-80e2-4e52-8641-98e62d0721f2" containerName="switch-graph-70f3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:06:29.245081 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:29.245041 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" podUID="d659fae6-80e2-4e52-8641-98e62d0721f2" containerName="switch-graph-70f3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:06:30.952499 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:30.952456 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 19:06:32.519642 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:32.519619 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" Apr 22 19:06:32.640965 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:32.640872 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d659fae6-80e2-4e52-8641-98e62d0721f2-openshift-service-ca-bundle\") pod \"d659fae6-80e2-4e52-8641-98e62d0721f2\" (UID: \"d659fae6-80e2-4e52-8641-98e62d0721f2\") " Apr 22 19:06:32.640965 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:32.640934 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d659fae6-80e2-4e52-8641-98e62d0721f2-proxy-tls\") pod \"d659fae6-80e2-4e52-8641-98e62d0721f2\" (UID: \"d659fae6-80e2-4e52-8641-98e62d0721f2\") " Apr 22 19:06:32.641329 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:32.641302 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d659fae6-80e2-4e52-8641-98e62d0721f2-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d659fae6-80e2-4e52-8641-98e62d0721f2" (UID: "d659fae6-80e2-4e52-8641-98e62d0721f2"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:06:32.642972 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:32.642951 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d659fae6-80e2-4e52-8641-98e62d0721f2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d659fae6-80e2-4e52-8641-98e62d0721f2" (UID: "d659fae6-80e2-4e52-8641-98e62d0721f2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:06:32.742060 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:32.742021 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d659fae6-80e2-4e52-8641-98e62d0721f2-openshift-service-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:06:32.742060 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:32.742055 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d659fae6-80e2-4e52-8641-98e62d0721f2-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:06:33.037841 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:33.037809 2577 generic.go:358] "Generic (PLEG): container finished" podID="d659fae6-80e2-4e52-8641-98e62d0721f2" containerID="7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869" exitCode=0 Apr 22 19:06:33.038011 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:33.037889 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" Apr 22 19:06:33.038011 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:33.037892 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" event={"ID":"d659fae6-80e2-4e52-8641-98e62d0721f2","Type":"ContainerDied","Data":"7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869"} Apr 22 19:06:33.038011 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:33.037936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj" event={"ID":"d659fae6-80e2-4e52-8641-98e62d0721f2","Type":"ContainerDied","Data":"36e4550e74c89c3807d76cfbadc152268e46ff2c2568572f1ad4935dd069214b"} Apr 22 19:06:33.038011 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:33.037956 2577 scope.go:117] "RemoveContainer" containerID="7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869" Apr 22 19:06:33.045984 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:33.045968 2577 scope.go:117] "RemoveContainer" containerID="7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869" Apr 22 19:06:33.046221 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:06:33.046201 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869\": container with ID starting with 7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869 not found: ID does not exist" containerID="7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869" Apr 22 19:06:33.046312 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:33.046231 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869"} err="failed to get container status 
\"7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869\": rpc error: code = NotFound desc = could not find container \"7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869\": container with ID starting with 7deeb02940bf93a3d2d77ffa227120b599dc4928277f5df19543c5e66f344869 not found: ID does not exist" Apr 22 19:06:33.053791 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:33.053769 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj"] Apr 22 19:06:33.056322 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:33.056301 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-70f3d-6496fb9568-5tskj"] Apr 22 19:06:34.991588 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:34.991557 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d659fae6-80e2-4e52-8641-98e62d0721f2" path="/var/lib/kubelet/pods/d659fae6-80e2-4e52-8641-98e62d0721f2/volumes" Apr 22 19:06:38.293571 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.293534 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"] Apr 22 19:06:38.294036 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.293828 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" podUID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" containerName="sequence-graph-0e291" containerID="cri-o://253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952" gracePeriod=30 Apr 22 19:06:38.472090 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.472053 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj"] Apr 22 19:06:38.472620 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.472490 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kserve-container" containerID="cri-o://72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3" gracePeriod=30 Apr 22 19:06:38.472620 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.472525 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kube-rbac-proxy" containerID="cri-o://0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f" gracePeriod=30 Apr 22 19:06:38.522444 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.522412 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96"] Apr 22 19:06:38.522911 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.522885 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kube-rbac-proxy" Apr 22 19:06:38.522911 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.522907 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kube-rbac-proxy" Apr 22 19:06:38.523078 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.522921 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d659fae6-80e2-4e52-8641-98e62d0721f2" containerName="switch-graph-70f3d" Apr 22 19:06:38.523078 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.522930 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d659fae6-80e2-4e52-8641-98e62d0721f2" containerName="switch-graph-70f3d" Apr 22 19:06:38.523078 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.522949 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" 
containerName="kserve-container" Apr 22 19:06:38.523078 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.522955 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kserve-container" Apr 22 19:06:38.523078 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.523017 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kube-rbac-proxy" Apr 22 19:06:38.523078 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.523030 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b913f2e-9481-4e91-9946-9d957f29ae31" containerName="kserve-container" Apr 22 19:06:38.523078 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.523041 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d659fae6-80e2-4e52-8641-98e62d0721f2" containerName="switch-graph-70f3d" Apr 22 19:06:38.527583 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.527564 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:38.529973 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.529953 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-23058-predictor-serving-cert\"" Apr 22 19:06:38.529973 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.529968 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-23058-kube-rbac-proxy-sar-config\"" Apr 22 19:06:38.541529 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.535515 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96"] Apr 22 19:06:38.594772 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.594739 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-23058-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27f5dc48-f149-4b40-a291-41d34033c8d2-error-404-isvc-23058-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-23058-predictor-7f696595cc-bzs96\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:38.594911 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.594801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27f5dc48-f149-4b40-a291-41d34033c8d2-proxy-tls\") pod \"error-404-isvc-23058-predictor-7f696595cc-bzs96\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:38.594911 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.594891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwmvk\" (UniqueName: 
\"kubernetes.io/projected/27f5dc48-f149-4b40-a291-41d34033c8d2-kube-api-access-xwmvk\") pod \"error-404-isvc-23058-predictor-7f696595cc-bzs96\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:38.696450 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.696415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-23058-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27f5dc48-f149-4b40-a291-41d34033c8d2-error-404-isvc-23058-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-23058-predictor-7f696595cc-bzs96\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:38.696641 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.696478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27f5dc48-f149-4b40-a291-41d34033c8d2-proxy-tls\") pod \"error-404-isvc-23058-predictor-7f696595cc-bzs96\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:38.696641 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.696511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwmvk\" (UniqueName: \"kubernetes.io/projected/27f5dc48-f149-4b40-a291-41d34033c8d2-kube-api-access-xwmvk\") pod \"error-404-isvc-23058-predictor-7f696595cc-bzs96\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:38.696641 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:06:38.696634 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-23058-predictor-serving-cert: secret "error-404-isvc-23058-predictor-serving-cert" not found Apr 22 19:06:38.696862 ip-10-0-143-56 
kubenswrapper[2577]: E0422 19:06:38.696717 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27f5dc48-f149-4b40-a291-41d34033c8d2-proxy-tls podName:27f5dc48-f149-4b40-a291-41d34033c8d2 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:39.196696391 +0000 UTC m=+1186.721359943 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/27f5dc48-f149-4b40-a291-41d34033c8d2-proxy-tls") pod "error-404-isvc-23058-predictor-7f696595cc-bzs96" (UID: "27f5dc48-f149-4b40-a291-41d34033c8d2") : secret "error-404-isvc-23058-predictor-serving-cert" not found Apr 22 19:06:38.697149 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.697124 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-23058-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27f5dc48-f149-4b40-a291-41d34033c8d2-error-404-isvc-23058-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-23058-predictor-7f696595cc-bzs96\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:38.705225 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:38.705203 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwmvk\" (UniqueName: \"kubernetes.io/projected/27f5dc48-f149-4b40-a291-41d34033c8d2-kube-api-access-xwmvk\") pod \"error-404-isvc-23058-predictor-7f696595cc-bzs96\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:39.059410 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:39.059376 2577 generic.go:358] "Generic (PLEG): container finished" podID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerID="0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f" exitCode=2 Apr 22 19:06:39.059585 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:39.059445 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" event={"ID":"d9c07e2c-f953-4b9e-8241-dbf9834c901f","Type":"ContainerDied","Data":"0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f"} Apr 22 19:06:39.201960 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:39.201916 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27f5dc48-f149-4b40-a291-41d34033c8d2-proxy-tls\") pod \"error-404-isvc-23058-predictor-7f696595cc-bzs96\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:39.204260 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:39.204236 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27f5dc48-f149-4b40-a291-41d34033c8d2-proxy-tls\") pod \"error-404-isvc-23058-predictor-7f696595cc-bzs96\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:39.446448 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:39.446361 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:39.572936 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:39.572904 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96"] Apr 22 19:06:39.576716 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:06:39.576691 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27f5dc48_f149_4b40_a291_41d34033c8d2.slice/crio-8f44a8d391ba7cbb4e74a9c59bb271f17113eb2a991ecfc873aae59cec9654a9 WatchSource:0}: Error finding container 8f44a8d391ba7cbb4e74a9c59bb271f17113eb2a991ecfc873aae59cec9654a9: Status 404 returned error can't find the container with id 8f44a8d391ba7cbb4e74a9c59bb271f17113eb2a991ecfc873aae59cec9654a9 Apr 22 19:06:40.065253 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:40.065220 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" event={"ID":"27f5dc48-f149-4b40-a291-41d34033c8d2","Type":"ContainerStarted","Data":"5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3"} Apr 22 19:06:40.065253 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:40.065255 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" event={"ID":"27f5dc48-f149-4b40-a291-41d34033c8d2","Type":"ContainerStarted","Data":"51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248"} Apr 22 19:06:40.065487 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:40.065284 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" event={"ID":"27f5dc48-f149-4b40-a291-41d34033c8d2","Type":"ContainerStarted","Data":"8f44a8d391ba7cbb4e74a9c59bb271f17113eb2a991ecfc873aae59cec9654a9"} Apr 22 19:06:40.065487 ip-10-0-143-56 
kubenswrapper[2577]: I0422 19:06:40.065366 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:40.085630 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:40.085565 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" podStartSLOduration=2.085551791 podStartE2EDuration="2.085551791s" podCreationTimestamp="2026-04-22 19:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:40.084092818 +0000 UTC m=+1187.608756368" watchObservedRunningTime="2026-04-22 19:06:40.085551791 +0000 UTC m=+1187.610215338" Apr 22 19:06:40.380168 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:40.380080 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" podUID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" containerName="sequence-graph-0e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:06:40.952517 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:40.952474 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 19:06:41.068652 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.068618 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:41.069723 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.069694 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" 
podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 19:06:41.625110 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.625081 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 19:06:41.724584 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.724554 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkg8n\" (UniqueName: \"kubernetes.io/projected/d9c07e2c-f953-4b9e-8241-dbf9834c901f-kube-api-access-hkg8n\") pod \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " Apr 22 19:06:41.724728 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.724595 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-0e291-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9c07e2c-f953-4b9e-8241-dbf9834c901f-error-404-isvc-0e291-kube-rbac-proxy-sar-config\") pod \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " Apr 22 19:06:41.724728 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.724656 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9c07e2c-f953-4b9e-8241-dbf9834c901f-proxy-tls\") pod \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\" (UID: \"d9c07e2c-f953-4b9e-8241-dbf9834c901f\") " Apr 22 19:06:41.724996 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.724962 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9c07e2c-f953-4b9e-8241-dbf9834c901f-error-404-isvc-0e291-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-0e291-kube-rbac-proxy-sar-config") pod "d9c07e2c-f953-4b9e-8241-dbf9834c901f" (UID: 
"d9c07e2c-f953-4b9e-8241-dbf9834c901f"). InnerVolumeSpecName "error-404-isvc-0e291-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:06:41.726715 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.726685 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c07e2c-f953-4b9e-8241-dbf9834c901f-kube-api-access-hkg8n" (OuterVolumeSpecName: "kube-api-access-hkg8n") pod "d9c07e2c-f953-4b9e-8241-dbf9834c901f" (UID: "d9c07e2c-f953-4b9e-8241-dbf9834c901f"). InnerVolumeSpecName "kube-api-access-hkg8n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:06:41.726813 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.726724 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9c07e2c-f953-4b9e-8241-dbf9834c901f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d9c07e2c-f953-4b9e-8241-dbf9834c901f" (UID: "d9c07e2c-f953-4b9e-8241-dbf9834c901f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:06:41.825692 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.825654 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hkg8n\" (UniqueName: \"kubernetes.io/projected/d9c07e2c-f953-4b9e-8241-dbf9834c901f-kube-api-access-hkg8n\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:06:41.825692 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.825685 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-0e291-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9c07e2c-f953-4b9e-8241-dbf9834c901f-error-404-isvc-0e291-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:06:41.825692 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:41.825696 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9c07e2c-f953-4b9e-8241-dbf9834c901f-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:06:42.073477 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.073383 2577 generic.go:358] "Generic (PLEG): container finished" podID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerID="72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3" exitCode=0 Apr 22 19:06:42.073477 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.073453 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" Apr 22 19:06:42.074002 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.073482 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" event={"ID":"d9c07e2c-f953-4b9e-8241-dbf9834c901f","Type":"ContainerDied","Data":"72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3"} Apr 22 19:06:42.074002 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.073529 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj" event={"ID":"d9c07e2c-f953-4b9e-8241-dbf9834c901f","Type":"ContainerDied","Data":"4b4005964b7bed77326c5331fefd0261c81cebfc80b60439f6c5dbd05adebf5a"} Apr 22 19:06:42.074002 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.073550 2577 scope.go:117] "RemoveContainer" containerID="0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f" Apr 22 19:06:42.074182 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.074158 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 19:06:42.082361 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.082342 2577 scope.go:117] "RemoveContainer" containerID="72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3" Apr 22 19:06:42.089770 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.089753 2577 scope.go:117] "RemoveContainer" containerID="0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f" Apr 22 19:06:42.090005 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:06:42.089986 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f\": container with ID starting with 0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f not found: ID does not exist" containerID="0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f" Apr 22 19:06:42.090082 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.090019 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f"} err="failed to get container status \"0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f\": rpc error: code = NotFound desc = could not find container \"0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f\": container with ID starting with 0a6959b05048f88dfab2003e349865e9ef17c3c6d9cccc4aa307d7e4f026db2f not found: ID does not exist" Apr 22 19:06:42.090082 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.090043 2577 scope.go:117] "RemoveContainer" containerID="72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3" Apr 22 19:06:42.090262 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:06:42.090247 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3\": container with ID starting with 72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3 not found: ID does not exist" containerID="72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3" Apr 22 19:06:42.090330 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.090283 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3"} err="failed to get container status \"72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3\": rpc error: code = NotFound desc = could not find container 
\"72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3\": container with ID starting with 72cc61f77a5fb46e189d8bd8989bddfc1da9a6780ccc9dfcb2d9c51c4a9cb3c3 not found: ID does not exist" Apr 22 19:06:42.108959 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.108930 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj"] Apr 22 19:06:42.119398 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.119372 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0e291-predictor-748c9cfd47-j5dmj"] Apr 22 19:06:42.991097 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:42.991062 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" path="/var/lib/kubelet/pods/d9c07e2c-f953-4b9e-8241-dbf9834c901f/volumes" Apr 22 19:06:45.380762 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:45.380725 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" podUID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" containerName="sequence-graph-0e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:06:47.078359 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:47.078332 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:06:47.078912 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:47.078887 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 19:06:50.380964 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:50.380916 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" podUID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" containerName="sequence-graph-0e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:06:50.381479 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:50.381078 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" Apr 22 19:06:50.953446 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:50.953419 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" Apr 22 19:06:52.940623 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:52.940594 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 19:06:52.943286 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:52.943251 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 19:06:55.380645 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:55.380607 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" podUID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" containerName="sequence-graph-0e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:06:57.079163 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:06:57.079115 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 19:07:00.379928 ip-10-0-143-56 kubenswrapper[2577]: I0422 
19:07:00.379891 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" podUID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" containerName="sequence-graph-0e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:07:02.579502 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.579465 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd"] Apr 22 19:07:02.580042 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.580022 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kube-rbac-proxy" Apr 22 19:07:02.580122 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.580045 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kube-rbac-proxy" Apr 22 19:07:02.580122 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.580071 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kserve-container" Apr 22 19:07:02.580122 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.580079 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kserve-container" Apr 22 19:07:02.580301 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.580169 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kserve-container" Apr 22 19:07:02.580301 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.580189 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9c07e2c-f953-4b9e-8241-dbf9834c901f" containerName="kube-rbac-proxy" Apr 22 19:07:02.584786 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.584763 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:02.587599 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.587574 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-50a5e-serving-cert\"" Apr 22 19:07:02.588474 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.588456 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-50a5e-kube-rbac-proxy-sar-config\"" Apr 22 19:07:02.597829 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.597799 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd"] Apr 22 19:07:02.610135 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.610112 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8251f818-24d3-40fb-9348-f235eaa96931-proxy-tls\") pod \"ensemble-graph-50a5e-dbdd56cbb-hcvhd\" (UID: \"8251f818-24d3-40fb-9348-f235eaa96931\") " pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:02.610255 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.610154 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251f818-24d3-40fb-9348-f235eaa96931-openshift-service-ca-bundle\") pod \"ensemble-graph-50a5e-dbdd56cbb-hcvhd\" (UID: \"8251f818-24d3-40fb-9348-f235eaa96931\") " pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:02.711094 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.711052 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251f818-24d3-40fb-9348-f235eaa96931-openshift-service-ca-bundle\") pod 
\"ensemble-graph-50a5e-dbdd56cbb-hcvhd\" (UID: \"8251f818-24d3-40fb-9348-f235eaa96931\") " pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:02.711263 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.711160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8251f818-24d3-40fb-9348-f235eaa96931-proxy-tls\") pod \"ensemble-graph-50a5e-dbdd56cbb-hcvhd\" (UID: \"8251f818-24d3-40fb-9348-f235eaa96931\") " pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:02.711263 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:02.711249 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-50a5e-serving-cert: secret "ensemble-graph-50a5e-serving-cert" not found Apr 22 19:07:02.711363 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:02.711317 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8251f818-24d3-40fb-9348-f235eaa96931-proxy-tls podName:8251f818-24d3-40fb-9348-f235eaa96931 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:03.211302356 +0000 UTC m=+1210.735965884 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8251f818-24d3-40fb-9348-f235eaa96931-proxy-tls") pod "ensemble-graph-50a5e-dbdd56cbb-hcvhd" (UID: "8251f818-24d3-40fb-9348-f235eaa96931") : secret "ensemble-graph-50a5e-serving-cert" not found Apr 22 19:07:02.711794 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:02.711771 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251f818-24d3-40fb-9348-f235eaa96931-openshift-service-ca-bundle\") pod \"ensemble-graph-50a5e-dbdd56cbb-hcvhd\" (UID: \"8251f818-24d3-40fb-9348-f235eaa96931\") " pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:03.215806 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:03.215774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8251f818-24d3-40fb-9348-f235eaa96931-proxy-tls\") pod \"ensemble-graph-50a5e-dbdd56cbb-hcvhd\" (UID: \"8251f818-24d3-40fb-9348-f235eaa96931\") " pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:03.218135 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:03.218115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8251f818-24d3-40fb-9348-f235eaa96931-proxy-tls\") pod \"ensemble-graph-50a5e-dbdd56cbb-hcvhd\" (UID: \"8251f818-24d3-40fb-9348-f235eaa96931\") " pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:03.495050 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:03.494946 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:03.616194 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:03.616168 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd"] Apr 22 19:07:03.618814 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:07:03.618782 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8251f818_24d3_40fb_9348_f235eaa96931.slice/crio-45efb7403accb7b18dff9742e03d4ba2294f901570d3b8d4bad3e9a22bb87ef9 WatchSource:0}: Error finding container 45efb7403accb7b18dff9742e03d4ba2294f901570d3b8d4bad3e9a22bb87ef9: Status 404 returned error can't find the container with id 45efb7403accb7b18dff9742e03d4ba2294f901570d3b8d4bad3e9a22bb87ef9 Apr 22 19:07:04.147041 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:04.147007 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" event={"ID":"8251f818-24d3-40fb-9348-f235eaa96931","Type":"ContainerStarted","Data":"4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be"} Apr 22 19:07:04.147041 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:04.147045 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" event={"ID":"8251f818-24d3-40fb-9348-f235eaa96931","Type":"ContainerStarted","Data":"45efb7403accb7b18dff9742e03d4ba2294f901570d3b8d4bad3e9a22bb87ef9"} Apr 22 19:07:04.147343 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:04.147070 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:04.163641 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:04.163596 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" 
podStartSLOduration=2.1635833939999998 podStartE2EDuration="2.163583394s" podCreationTimestamp="2026-04-22 19:07:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:07:04.162499599 +0000 UTC m=+1211.687163151" watchObservedRunningTime="2026-04-22 19:07:04.163583394 +0000 UTC m=+1211.688246945" Apr 22 19:07:05.380738 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:05.380699 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" podUID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" containerName="sequence-graph-0e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:07:07.078999 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:07.078961 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 19:07:08.443447 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:08.443420 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" Apr 22 19:07:08.560897 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:08.560810 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b81c6a1-d055-4e93-91a8-786d4c7267e3-openshift-service-ca-bundle\") pod \"2b81c6a1-d055-4e93-91a8-786d4c7267e3\" (UID: \"2b81c6a1-d055-4e93-91a8-786d4c7267e3\") " Apr 22 19:07:08.560897 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:08.560858 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b81c6a1-d055-4e93-91a8-786d4c7267e3-proxy-tls\") pod \"2b81c6a1-d055-4e93-91a8-786d4c7267e3\" (UID: \"2b81c6a1-d055-4e93-91a8-786d4c7267e3\") " Apr 22 19:07:08.561239 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:08.561215 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b81c6a1-d055-4e93-91a8-786d4c7267e3-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2b81c6a1-d055-4e93-91a8-786d4c7267e3" (UID: "2b81c6a1-d055-4e93-91a8-786d4c7267e3"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:07:08.562901 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:08.562882 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b81c6a1-d055-4e93-91a8-786d4c7267e3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2b81c6a1-d055-4e93-91a8-786d4c7267e3" (UID: "2b81c6a1-d055-4e93-91a8-786d4c7267e3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:07:08.661798 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:08.661762 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b81c6a1-d055-4e93-91a8-786d4c7267e3-openshift-service-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:07:08.661798 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:08.661792 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b81c6a1-d055-4e93-91a8-786d4c7267e3-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:07:09.164032 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:09.164002 2577 generic.go:358] "Generic (PLEG): container finished" podID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" containerID="253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952" exitCode=0 Apr 22 19:07:09.164232 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:09.164057 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" Apr 22 19:07:09.164232 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:09.164063 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" event={"ID":"2b81c6a1-d055-4e93-91a8-786d4c7267e3","Type":"ContainerDied","Data":"253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952"} Apr 22 19:07:09.164232 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:09.164092 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j" event={"ID":"2b81c6a1-d055-4e93-91a8-786d4c7267e3","Type":"ContainerDied","Data":"60b9b4e89e83118ba27940ad52bd11c7150c65ef65fe07cc0bdadfe9d54b8c6c"} Apr 22 19:07:09.164232 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:09.164107 2577 scope.go:117] "RemoveContainer" containerID="253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952" Apr 22 19:07:09.172128 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:09.172111 2577 scope.go:117] "RemoveContainer" containerID="253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952" Apr 22 19:07:09.172433 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:09.172410 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952\": container with ID starting with 253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952 not found: ID does not exist" containerID="253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952" Apr 22 19:07:09.172524 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:09.172443 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952"} err="failed to get container status 
\"253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952\": rpc error: code = NotFound desc = could not find container \"253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952\": container with ID starting with 253cd2f99a0c6443cde0431f6cd886ad3a960fae55f8c55592ff409afaa78952 not found: ID does not exist" Apr 22 19:07:09.180598 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:09.180576 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"] Apr 22 19:07:09.188255 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:09.188227 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0e291-767bbd5cdd-qwg5j"] Apr 22 19:07:10.156192 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:10.156157 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:10.991849 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:10.991816 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" path="/var/lib/kubelet/pods/2b81c6a1-d055-4e93-91a8-786d4c7267e3/volumes" Apr 22 19:07:12.640820 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.640736 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd"] Apr 22 19:07:12.641176 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.640945 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" podUID="8251f818-24d3-40fb-9348-f235eaa96931" containerName="ensemble-graph-50a5e" containerID="cri-o://4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be" gracePeriod=30 Apr 22 19:07:12.809238 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.809209 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"] Apr 22 19:07:12.809606 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.809566 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kserve-container" containerID="cri-o://79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5" gracePeriod=30 Apr 22 19:07:12.809746 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.809566 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kube-rbac-proxy" containerID="cri-o://1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea" gracePeriod=30 Apr 22 19:07:12.843771 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.843741 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2"] Apr 22 19:07:12.844219 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.844203 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" containerName="sequence-graph-0e291" Apr 22 19:07:12.844282 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.844223 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" containerName="sequence-graph-0e291" Apr 22 19:07:12.844329 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.844321 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b81c6a1-d055-4e93-91a8-786d4c7267e3" containerName="sequence-graph-0e291" Apr 22 19:07:12.849130 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.849111 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:12.851569 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.851548 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f6fce-predictor-serving-cert\"" Apr 22 19:07:12.851650 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.851576 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f6fce-kube-rbac-proxy-sar-config\"" Apr 22 19:07:12.856622 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.856604 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2"] Apr 22 19:07:12.897183 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.897097 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xngg2\" (UniqueName: \"kubernetes.io/projected/e2897854-0854-4157-8b02-eeaae615da73-kube-api-access-xngg2\") pod \"error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") " pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:12.897351 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.897206 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-f6fce-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e2897854-0854-4157-8b02-eeaae615da73-error-404-isvc-f6fce-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") " pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:12.897351 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.897328 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls\") pod \"error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") " pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:12.998560 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.998524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xngg2\" (UniqueName: \"kubernetes.io/projected/e2897854-0854-4157-8b02-eeaae615da73-kube-api-access-xngg2\") pod \"error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") " pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:12.998741 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.998587 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-f6fce-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e2897854-0854-4157-8b02-eeaae615da73-error-404-isvc-f6fce-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") " pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:12.998741 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.998613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls\") pod \"error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") " pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:12.998741 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:12.998731 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-serving-cert: secret "error-404-isvc-f6fce-predictor-serving-cert" not found Apr 22 19:07:12.999114 ip-10-0-143-56 kubenswrapper[2577]: E0422 
19:07:12.998793 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls podName:e2897854-0854-4157-8b02-eeaae615da73 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:13.498773538 +0000 UTC m=+1221.023437067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls") pod "error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" (UID: "e2897854-0854-4157-8b02-eeaae615da73") : secret "error-404-isvc-f6fce-predictor-serving-cert" not found Apr 22 19:07:12.999364 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:12.999345 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-f6fce-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e2897854-0854-4157-8b02-eeaae615da73-error-404-isvc-f6fce-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") " pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:13.006849 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:13.006827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xngg2\" (UniqueName: \"kubernetes.io/projected/e2897854-0854-4157-8b02-eeaae615da73-kube-api-access-xngg2\") pod \"error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") " pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:13.180035 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:13.179952 2577 generic.go:358] "Generic (PLEG): container finished" podID="a321ae25-0e38-4211-b254-595a1170bd90" containerID="1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea" exitCode=2 Apr 22 19:07:13.180035 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:13.180015 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" event={"ID":"a321ae25-0e38-4211-b254-595a1170bd90","Type":"ContainerDied","Data":"1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea"} Apr 22 19:07:13.504603 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:13.504562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls\") pod \"error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") " pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:13.504817 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:13.504733 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-serving-cert: secret "error-404-isvc-f6fce-predictor-serving-cert" not found Apr 22 19:07:13.504817 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:13.504812 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls podName:e2897854-0854-4157-8b02-eeaae615da73 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:14.504792756 +0000 UTC m=+1222.029456288 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls") pod "error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" (UID: "e2897854-0854-4157-8b02-eeaae615da73") : secret "error-404-isvc-f6fce-predictor-serving-cert" not found Apr 22 19:07:14.512541 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:14.512502 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls\") pod \"error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") " pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:14.515021 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:14.514989 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls\") pod \"error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") " pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:14.660117 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:14.660078 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:14.798579 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:14.798493 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2"] Apr 22 19:07:14.802012 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:07:14.801977 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2897854_0854_4157_8b02_eeaae615da73.slice/crio-081b688f7ed48fd6217fdc4b8459e4f95c0a0d48cc15856e1616ba1f6c5a3d87 WatchSource:0}: Error finding container 081b688f7ed48fd6217fdc4b8459e4f95c0a0d48cc15856e1616ba1f6c5a3d87: Status 404 returned error can't find the container with id 081b688f7ed48fd6217fdc4b8459e4f95c0a0d48cc15856e1616ba1f6c5a3d87 Apr 22 19:07:15.155192 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:15.155101 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" podUID="8251f818-24d3-40fb-9348-f235eaa96931" containerName="ensemble-graph-50a5e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:07:15.187866 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:15.187833 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" event={"ID":"e2897854-0854-4157-8b02-eeaae615da73","Type":"ContainerStarted","Data":"928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92"} Apr 22 19:07:15.188025 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:15.187872 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" event={"ID":"e2897854-0854-4157-8b02-eeaae615da73","Type":"ContainerStarted","Data":"4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4"} Apr 22 19:07:15.188025 ip-10-0-143-56 kubenswrapper[2577]: 
I0422 19:07:15.187886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" event={"ID":"e2897854-0854-4157-8b02-eeaae615da73","Type":"ContainerStarted","Data":"081b688f7ed48fd6217fdc4b8459e4f95c0a0d48cc15856e1616ba1f6c5a3d87"} Apr 22 19:07:15.188025 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:15.187962 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:15.205873 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:15.205810 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" podStartSLOduration=3.205790215 podStartE2EDuration="3.205790215s" podCreationTimestamp="2026-04-22 19:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:07:15.202784772 +0000 UTC m=+1222.727448324" watchObservedRunningTime="2026-04-22 19:07:15.205790215 +0000 UTC m=+1222.730453770" Apr 22 19:07:15.955367 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:15.955344 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" Apr 22 19:07:16.028394 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.028358 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2j7p\" (UniqueName: \"kubernetes.io/projected/a321ae25-0e38-4211-b254-595a1170bd90-kube-api-access-t2j7p\") pod \"a321ae25-0e38-4211-b254-595a1170bd90\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " Apr 22 19:07:16.028563 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.028425 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a321ae25-0e38-4211-b254-595a1170bd90-proxy-tls\") pod \"a321ae25-0e38-4211-b254-595a1170bd90\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " Apr 22 19:07:16.028563 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.028444 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-50a5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a321ae25-0e38-4211-b254-595a1170bd90-error-404-isvc-50a5e-kube-rbac-proxy-sar-config\") pod \"a321ae25-0e38-4211-b254-595a1170bd90\" (UID: \"a321ae25-0e38-4211-b254-595a1170bd90\") " Apr 22 19:07:16.028862 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.028834 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a321ae25-0e38-4211-b254-595a1170bd90-error-404-isvc-50a5e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-50a5e-kube-rbac-proxy-sar-config") pod "a321ae25-0e38-4211-b254-595a1170bd90" (UID: "a321ae25-0e38-4211-b254-595a1170bd90"). InnerVolumeSpecName "error-404-isvc-50a5e-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:07:16.030571 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.030547 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a321ae25-0e38-4211-b254-595a1170bd90-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a321ae25-0e38-4211-b254-595a1170bd90" (UID: "a321ae25-0e38-4211-b254-595a1170bd90"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:07:16.030676 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.030594 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a321ae25-0e38-4211-b254-595a1170bd90-kube-api-access-t2j7p" (OuterVolumeSpecName: "kube-api-access-t2j7p") pod "a321ae25-0e38-4211-b254-595a1170bd90" (UID: "a321ae25-0e38-4211-b254-595a1170bd90"). InnerVolumeSpecName "kube-api-access-t2j7p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:07:16.130127 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.130037 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t2j7p\" (UniqueName: \"kubernetes.io/projected/a321ae25-0e38-4211-b254-595a1170bd90-kube-api-access-t2j7p\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:07:16.130127 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.130063 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a321ae25-0e38-4211-b254-595a1170bd90-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:07:16.130127 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.130075 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-50a5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a321ae25-0e38-4211-b254-595a1170bd90-error-404-isvc-50a5e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:07:16.193134 
ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.193101 2577 generic.go:358] "Generic (PLEG): container finished" podID="a321ae25-0e38-4211-b254-595a1170bd90" containerID="79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5" exitCode=0 Apr 22 19:07:16.193366 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.193169 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" Apr 22 19:07:16.193366 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.193172 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" event={"ID":"a321ae25-0e38-4211-b254-595a1170bd90","Type":"ContainerDied","Data":"79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5"} Apr 22 19:07:16.193366 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.193210 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" event={"ID":"a321ae25-0e38-4211-b254-595a1170bd90","Type":"ContainerDied","Data":"388096663fa3e6fc56139ff465a1de5c8556abad20a8951b29af658430b0ef61"} Apr 22 19:07:16.193366 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.193228 2577 scope.go:117] "RemoveContainer" containerID="1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea" Apr 22 19:07:16.193836 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.193803 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:16.195086 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.195055 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection 
refused" Apr 22 19:07:16.202507 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.202448 2577 scope.go:117] "RemoveContainer" containerID="79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5" Apr 22 19:07:16.210833 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.210812 2577 scope.go:117] "RemoveContainer" containerID="1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea" Apr 22 19:07:16.211091 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:16.211072 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea\": container with ID starting with 1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea not found: ID does not exist" containerID="1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea" Apr 22 19:07:16.211144 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.211113 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea"} err="failed to get container status \"1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea\": rpc error: code = NotFound desc = could not find container \"1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea\": container with ID starting with 1d08763223fdedcdaff23904531f2ddbaf415dfe6762245f149c818fbe8665ea not found: ID does not exist" Apr 22 19:07:16.211144 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.211133 2577 scope.go:117] "RemoveContainer" containerID="79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5" Apr 22 19:07:16.211411 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:16.211392 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5\": container with ID starting 
with 79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5 not found: ID does not exist" containerID="79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5" Apr 22 19:07:16.211465 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.211418 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5"} err="failed to get container status \"79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5\": rpc error: code = NotFound desc = could not find container \"79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5\": container with ID starting with 79616f16c4d1ed36aee8760f3e708042ad0866fb4d1c00421a0aaff2bf2279e5 not found: ID does not exist" Apr 22 19:07:16.215637 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.215613 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"] Apr 22 19:07:16.221361 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.221340 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4"] Apr 22 19:07:16.948221 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.948177 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-50a5e-predictor-9c6d696b5-r8mz4" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": context deadline exceeded" Apr 22 19:07:16.990751 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:16.990720 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a321ae25-0e38-4211-b254-595a1170bd90" path="/var/lib/kubelet/pods/a321ae25-0e38-4211-b254-595a1170bd90/volumes" Apr 22 19:07:17.078907 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:17.078870 2577 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 19:07:17.197764 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:17.197719 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 22 19:07:20.154919 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:20.154874 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" podUID="8251f818-24d3-40fb-9348-f235eaa96931" containerName="ensemble-graph-50a5e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:07:22.202302 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:22.202250 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:07:22.202742 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:22.202717 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 22 19:07:25.154529 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:25.154486 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" podUID="8251f818-24d3-40fb-9348-f235eaa96931" containerName="ensemble-graph-50a5e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:07:25.154910 ip-10-0-143-56 
kubenswrapper[2577]: I0422 19:07:25.154585 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:27.079421 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:27.079389 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:07:30.154496 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:30.154451 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" podUID="8251f818-24d3-40fb-9348-f235eaa96931" containerName="ensemble-graph-50a5e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:07:32.202830 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:32.202787 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 22 19:07:35.155100 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:35.155059 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" podUID="8251f818-24d3-40fb-9348-f235eaa96931" containerName="ensemble-graph-50a5e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:07:38.448813 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.448778 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz"] Apr 22 19:07:38.449328 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.449242 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kube-rbac-proxy" Apr 22 19:07:38.449328 ip-10-0-143-56 kubenswrapper[2577]: I0422 
19:07:38.449255 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kube-rbac-proxy" Apr 22 19:07:38.449328 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.449283 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kserve-container" Apr 22 19:07:38.449328 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.449289 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kserve-container" Apr 22 19:07:38.449498 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.449368 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kube-rbac-proxy" Apr 22 19:07:38.449498 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.449380 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a321ae25-0e38-4211-b254-595a1170bd90" containerName="kserve-container" Apr 22 19:07:38.453714 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.453695 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" Apr 22 19:07:38.455969 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.455943 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-23058-kube-rbac-proxy-sar-config\"" Apr 22 19:07:38.455969 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.455956 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-23058-serving-cert\"" Apr 22 19:07:38.459609 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.459586 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz"] Apr 22 19:07:38.627723 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.627691 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37164fb0-2462-40ed-8929-4012e1477524-openshift-service-ca-bundle\") pod \"sequence-graph-23058-65fbffcfb8-hmthz\" (UID: \"37164fb0-2462-40ed-8929-4012e1477524\") " pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" Apr 22 19:07:38.627723 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.627729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37164fb0-2462-40ed-8929-4012e1477524-proxy-tls\") pod \"sequence-graph-23058-65fbffcfb8-hmthz\" (UID: \"37164fb0-2462-40ed-8929-4012e1477524\") " pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" Apr 22 19:07:38.728995 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.728965 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37164fb0-2462-40ed-8929-4012e1477524-openshift-service-ca-bundle\") pod 
\"sequence-graph-23058-65fbffcfb8-hmthz\" (UID: \"37164fb0-2462-40ed-8929-4012e1477524\") " pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" Apr 22 19:07:38.728995 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.729003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37164fb0-2462-40ed-8929-4012e1477524-proxy-tls\") pod \"sequence-graph-23058-65fbffcfb8-hmthz\" (UID: \"37164fb0-2462-40ed-8929-4012e1477524\") " pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" Apr 22 19:07:38.729622 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.729601 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37164fb0-2462-40ed-8929-4012e1477524-openshift-service-ca-bundle\") pod \"sequence-graph-23058-65fbffcfb8-hmthz\" (UID: \"37164fb0-2462-40ed-8929-4012e1477524\") " pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" Apr 22 19:07:38.731463 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.731443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37164fb0-2462-40ed-8929-4012e1477524-proxy-tls\") pod \"sequence-graph-23058-65fbffcfb8-hmthz\" (UID: \"37164fb0-2462-40ed-8929-4012e1477524\") " pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" Apr 22 19:07:38.764892 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:38.764864 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" Apr 22 19:07:39.092136 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:39.092106 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz"] Apr 22 19:07:39.094990 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:07:39.094949 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37164fb0_2462_40ed_8929_4012e1477524.slice/crio-ed429738cdea2040252a13d12549317c7e9d50302dc1d50ba4fd4420a1a8c331 WatchSource:0}: Error finding container ed429738cdea2040252a13d12549317c7e9d50302dc1d50ba4fd4420a1a8c331: Status 404 returned error can't find the container with id ed429738cdea2040252a13d12549317c7e9d50302dc1d50ba4fd4420a1a8c331 Apr 22 19:07:39.275429 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:39.275384 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" event={"ID":"37164fb0-2462-40ed-8929-4012e1477524","Type":"ContainerStarted","Data":"7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820"} Apr 22 19:07:39.275429 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:39.275433 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" event={"ID":"37164fb0-2462-40ed-8929-4012e1477524","Type":"ContainerStarted","Data":"ed429738cdea2040252a13d12549317c7e9d50302dc1d50ba4fd4420a1a8c331"} Apr 22 19:07:39.275637 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:39.275514 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" Apr 22 19:07:39.291859 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:39.291816 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" 
podStartSLOduration=1.291803812 podStartE2EDuration="1.291803812s" podCreationTimestamp="2026-04-22 19:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:07:39.289155373 +0000 UTC m=+1246.813818920" watchObservedRunningTime="2026-04-22 19:07:39.291803812 +0000 UTC m=+1246.816467362" Apr 22 19:07:40.154825 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:40.154738 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" podUID="8251f818-24d3-40fb-9348-f235eaa96931" containerName="ensemble-graph-50a5e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:07:42.202694 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:42.202652 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 22 19:07:42.794029 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:42.794006 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:42.863613 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:42.863583 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8251f818-24d3-40fb-9348-f235eaa96931-proxy-tls\") pod \"8251f818-24d3-40fb-9348-f235eaa96931\" (UID: \"8251f818-24d3-40fb-9348-f235eaa96931\") " Apr 22 19:07:42.863762 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:42.863634 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251f818-24d3-40fb-9348-f235eaa96931-openshift-service-ca-bundle\") pod \"8251f818-24d3-40fb-9348-f235eaa96931\" (UID: \"8251f818-24d3-40fb-9348-f235eaa96931\") " Apr 22 19:07:42.863990 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:42.863958 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8251f818-24d3-40fb-9348-f235eaa96931-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8251f818-24d3-40fb-9348-f235eaa96931" (UID: "8251f818-24d3-40fb-9348-f235eaa96931"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:07:42.865829 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:42.865802 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8251f818-24d3-40fb-9348-f235eaa96931-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8251f818-24d3-40fb-9348-f235eaa96931" (UID: "8251f818-24d3-40fb-9348-f235eaa96931"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:07:42.964337 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:42.964308 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8251f818-24d3-40fb-9348-f235eaa96931-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:07:42.964480 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:42.964340 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251f818-24d3-40fb-9348-f235eaa96931-openshift-service-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:07:43.289369 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:43.289257 2577 generic.go:358] "Generic (PLEG): container finished" podID="8251f818-24d3-40fb-9348-f235eaa96931" containerID="4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be" exitCode=0 Apr 22 19:07:43.289369 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:43.289328 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" Apr 22 19:07:43.289369 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:43.289341 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" event={"ID":"8251f818-24d3-40fb-9348-f235eaa96931","Type":"ContainerDied","Data":"4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be"} Apr 22 19:07:43.289852 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:43.289377 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd" event={"ID":"8251f818-24d3-40fb-9348-f235eaa96931","Type":"ContainerDied","Data":"45efb7403accb7b18dff9742e03d4ba2294f901570d3b8d4bad3e9a22bb87ef9"} Apr 22 19:07:43.289852 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:43.289392 2577 scope.go:117] "RemoveContainer" containerID="4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be" Apr 22 19:07:43.297928 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:43.297905 2577 scope.go:117] "RemoveContainer" containerID="4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be" Apr 22 19:07:43.298185 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:43.298166 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be\": container with ID starting with 4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be not found: ID does not exist" containerID="4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be" Apr 22 19:07:43.298264 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:43.298198 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be"} err="failed to get container status 
\"4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be\": rpc error: code = NotFound desc = could not find container \"4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be\": container with ID starting with 4eacc023025e2f4f2bec3b3591a9cb192a0d392ea28bbb48334f7352b9fce9be not found: ID does not exist" Apr 22 19:07:43.307517 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:43.307483 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd"] Apr 22 19:07:43.312646 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:43.312622 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-50a5e-dbdd56cbb-hcvhd"] Apr 22 19:07:44.991145 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:44.991110 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8251f818-24d3-40fb-9348-f235eaa96931" path="/var/lib/kubelet/pods/8251f818-24d3-40fb-9348-f235eaa96931/volumes" Apr 22 19:07:45.284100 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:45.284022 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" Apr 22 19:07:48.527444 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.527411 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz"] Apr 22 19:07:48.527814 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.527628 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" podUID="37164fb0-2462-40ed-8929-4012e1477524" containerName="sequence-graph-23058" containerID="cri-o://7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820" gracePeriod=30 Apr 22 19:07:48.724681 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.724644 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96"] Apr 22 19:07:48.725017 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.724986 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kserve-container" containerID="cri-o://51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248" gracePeriod=30 Apr 22 19:07:48.725111 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.725027 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kube-rbac-proxy" containerID="cri-o://5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3" gracePeriod=30 Apr 22 19:07:48.767199 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.767171 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6"] Apr 22 19:07:48.767580 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.767567 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8251f818-24d3-40fb-9348-f235eaa96931" containerName="ensemble-graph-50a5e" Apr 22 19:07:48.767629 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.767582 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8251f818-24d3-40fb-9348-f235eaa96931" containerName="ensemble-graph-50a5e" Apr 22 19:07:48.767672 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.767646 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8251f818-24d3-40fb-9348-f235eaa96931" containerName="ensemble-graph-50a5e" Apr 22 19:07:48.770668 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.770652 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:48.773009 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.772990 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ffb29-predictor-serving-cert\"" Apr 22 19:07:48.773075 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.772990 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ffb29-kube-rbac-proxy-sar-config\"" Apr 22 19:07:48.778463 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.778439 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6"] Apr 22 19:07:48.801679 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.801643 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5w79\" (UniqueName: \"kubernetes.io/projected/2c07bb2c-1a22-4a32-a847-183c5af71d35-kube-api-access-h5w79\") pod \"error-404-isvc-ffb29-predictor-564df6b8f9-fssm6\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:48.801827 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.801702 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-ffb29-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2c07bb2c-1a22-4a32-a847-183c5af71d35-error-404-isvc-ffb29-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ffb29-predictor-564df6b8f9-fssm6\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:48.801827 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.801756 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/2c07bb2c-1a22-4a32-a847-183c5af71d35-proxy-tls\") pod \"error-404-isvc-ffb29-predictor-564df6b8f9-fssm6\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:48.902327 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.902289 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5w79\" (UniqueName: \"kubernetes.io/projected/2c07bb2c-1a22-4a32-a847-183c5af71d35-kube-api-access-h5w79\") pod \"error-404-isvc-ffb29-predictor-564df6b8f9-fssm6\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:48.902524 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.902370 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-ffb29-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2c07bb2c-1a22-4a32-a847-183c5af71d35-error-404-isvc-ffb29-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ffb29-predictor-564df6b8f9-fssm6\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:48.902524 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.902431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c07bb2c-1a22-4a32-a847-183c5af71d35-proxy-tls\") pod \"error-404-isvc-ffb29-predictor-564df6b8f9-fssm6\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:48.902642 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:48.902573 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-serving-cert: secret "error-404-isvc-ffb29-predictor-serving-cert" not found Apr 22 19:07:48.902642 ip-10-0-143-56 
kubenswrapper[2577]: E0422 19:07:48.902635 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c07bb2c-1a22-4a32-a847-183c5af71d35-proxy-tls podName:2c07bb2c-1a22-4a32-a847-183c5af71d35 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:49.402615347 +0000 UTC m=+1256.927278879 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2c07bb2c-1a22-4a32-a847-183c5af71d35-proxy-tls") pod "error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" (UID: "2c07bb2c-1a22-4a32-a847-183c5af71d35") : secret "error-404-isvc-ffb29-predictor-serving-cert" not found Apr 22 19:07:48.903039 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.903013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-ffb29-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2c07bb2c-1a22-4a32-a847-183c5af71d35-error-404-isvc-ffb29-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ffb29-predictor-564df6b8f9-fssm6\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:48.910899 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:48.910875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5w79\" (UniqueName: \"kubernetes.io/projected/2c07bb2c-1a22-4a32-a847-183c5af71d35-kube-api-access-h5w79\") pod \"error-404-isvc-ffb29-predictor-564df6b8f9-fssm6\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:49.313657 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:49.313615 2577 generic.go:358] "Generic (PLEG): container finished" podID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerID="5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3" exitCode=2 Apr 22 19:07:49.313834 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:49.313693 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" event={"ID":"27f5dc48-f149-4b40-a291-41d34033c8d2","Type":"ContainerDied","Data":"5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3"} Apr 22 19:07:49.405140 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:49.405111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c07bb2c-1a22-4a32-a847-183c5af71d35-proxy-tls\") pod \"error-404-isvc-ffb29-predictor-564df6b8f9-fssm6\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:49.407471 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:49.407445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c07bb2c-1a22-4a32-a847-183c5af71d35-proxy-tls\") pod \"error-404-isvc-ffb29-predictor-564df6b8f9-fssm6\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:49.682013 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:49.681923 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:49.807790 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:49.807763 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6"] Apr 22 19:07:49.809832 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:07:49.809806 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c07bb2c_1a22_4a32_a847_183c5af71d35.slice/crio-26e3b949a5eff8c116a6ed5a6377fe02c1bea9c8959ee5be6c3c5dac4a62ca1e WatchSource:0}: Error finding container 26e3b949a5eff8c116a6ed5a6377fe02c1bea9c8959ee5be6c3c5dac4a62ca1e: Status 404 returned error can't find the container with id 26e3b949a5eff8c116a6ed5a6377fe02c1bea9c8959ee5be6c3c5dac4a62ca1e Apr 22 19:07:50.283810 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:50.283767 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" podUID="37164fb0-2462-40ed-8929-4012e1477524" containerName="sequence-graph-23058" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:07:50.319467 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:50.319433 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" event={"ID":"2c07bb2c-1a22-4a32-a847-183c5af71d35","Type":"ContainerStarted","Data":"4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d"} Apr 22 19:07:50.319467 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:50.319472 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" event={"ID":"2c07bb2c-1a22-4a32-a847-183c5af71d35","Type":"ContainerStarted","Data":"23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08"} Apr 22 19:07:50.319667 ip-10-0-143-56 kubenswrapper[2577]: 
I0422 19:07:50.319483 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" event={"ID":"2c07bb2c-1a22-4a32-a847-183c5af71d35","Type":"ContainerStarted","Data":"26e3b949a5eff8c116a6ed5a6377fe02c1bea9c8959ee5be6c3c5dac4a62ca1e"} Apr 22 19:07:50.319667 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:50.319568 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:50.337205 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:50.337157 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" podStartSLOduration=2.337138134 podStartE2EDuration="2.337138134s" podCreationTimestamp="2026-04-22 19:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:07:50.334724055 +0000 UTC m=+1257.859387636" watchObservedRunningTime="2026-04-22 19:07:50.337138134 +0000 UTC m=+1257.861801688" Apr 22 19:07:51.323100 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:51.323062 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:51.324435 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:51.324409 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 22 19:07:51.968425 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:51.968398 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:07:52.022287 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.022235 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwmvk\" (UniqueName: \"kubernetes.io/projected/27f5dc48-f149-4b40-a291-41d34033c8d2-kube-api-access-xwmvk\") pod \"27f5dc48-f149-4b40-a291-41d34033c8d2\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " Apr 22 19:07:52.022443 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.022324 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27f5dc48-f149-4b40-a291-41d34033c8d2-proxy-tls\") pod \"27f5dc48-f149-4b40-a291-41d34033c8d2\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " Apr 22 19:07:52.022443 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.022362 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-23058-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27f5dc48-f149-4b40-a291-41d34033c8d2-error-404-isvc-23058-kube-rbac-proxy-sar-config\") pod \"27f5dc48-f149-4b40-a291-41d34033c8d2\" (UID: \"27f5dc48-f149-4b40-a291-41d34033c8d2\") " Apr 22 19:07:52.022789 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.022754 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f5dc48-f149-4b40-a291-41d34033c8d2-error-404-isvc-23058-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-23058-kube-rbac-proxy-sar-config") pod "27f5dc48-f149-4b40-a291-41d34033c8d2" (UID: "27f5dc48-f149-4b40-a291-41d34033c8d2"). InnerVolumeSpecName "error-404-isvc-23058-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:07:52.024301 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.024261 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f5dc48-f149-4b40-a291-41d34033c8d2-kube-api-access-xwmvk" (OuterVolumeSpecName: "kube-api-access-xwmvk") pod "27f5dc48-f149-4b40-a291-41d34033c8d2" (UID: "27f5dc48-f149-4b40-a291-41d34033c8d2"). InnerVolumeSpecName "kube-api-access-xwmvk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:07:52.024403 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.024333 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f5dc48-f149-4b40-a291-41d34033c8d2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "27f5dc48-f149-4b40-a291-41d34033c8d2" (UID: "27f5dc48-f149-4b40-a291-41d34033c8d2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:07:52.123410 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.123328 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwmvk\" (UniqueName: \"kubernetes.io/projected/27f5dc48-f149-4b40-a291-41d34033c8d2-kube-api-access-xwmvk\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:07:52.123410 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.123357 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27f5dc48-f149-4b40-a291-41d34033c8d2-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:07:52.123410 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.123367 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-23058-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27f5dc48-f149-4b40-a291-41d34033c8d2-error-404-isvc-23058-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:07:52.203415 
ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.203371 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 22 19:07:52.327448 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.327415 2577 generic.go:358] "Generic (PLEG): container finished" podID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerID="51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248" exitCode=0 Apr 22 19:07:52.327876 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.327490 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" Apr 22 19:07:52.327876 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.327500 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" event={"ID":"27f5dc48-f149-4b40-a291-41d34033c8d2","Type":"ContainerDied","Data":"51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248"} Apr 22 19:07:52.327876 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.327536 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96" event={"ID":"27f5dc48-f149-4b40-a291-41d34033c8d2","Type":"ContainerDied","Data":"8f44a8d391ba7cbb4e74a9c59bb271f17113eb2a991ecfc873aae59cec9654a9"} Apr 22 19:07:52.327876 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.327552 2577 scope.go:117] "RemoveContainer" containerID="5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3" Apr 22 19:07:52.328206 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.328176 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" 
podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 22 19:07:52.338953 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.338936 2577 scope.go:117] "RemoveContainer" containerID="51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248" Apr 22 19:07:52.346090 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.346073 2577 scope.go:117] "RemoveContainer" containerID="5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3" Apr 22 19:07:52.346372 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:52.346350 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3\": container with ID starting with 5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3 not found: ID does not exist" containerID="5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3" Apr 22 19:07:52.346443 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.346381 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3"} err="failed to get container status \"5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3\": rpc error: code = NotFound desc = could not find container \"5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3\": container with ID starting with 5762e5e9fe7107c350dbf9d2b07475df483923a108b4c2415a4385aff61f87e3 not found: ID does not exist" Apr 22 19:07:52.346443 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.346398 2577 scope.go:117] "RemoveContainer" containerID="51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248" Apr 22 19:07:52.346631 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:07:52.346614 2577 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248\": container with ID starting with 51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248 not found: ID does not exist" containerID="51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248" Apr 22 19:07:52.346673 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.346637 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248"} err="failed to get container status \"51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248\": rpc error: code = NotFound desc = could not find container \"51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248\": container with ID starting with 51e30e663159628877deaf54552a498f89fd52dbdc41d051d87f68a9c86c6248 not found: ID does not exist" Apr 22 19:07:52.351641 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.351619 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96"] Apr 22 19:07:52.356713 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.356692 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-23058-predictor-7f696595cc-bzs96"] Apr 22 19:07:52.992243 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:52.992209 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" path="/var/lib/kubelet/pods/27f5dc48-f149-4b40-a291-41d34033c8d2/volumes" Apr 22 19:07:55.283086 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:55.283046 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" podUID="37164fb0-2462-40ed-8929-4012e1477524" containerName="sequence-graph-23058" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 
19:07:57.332202 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:57.332176 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:07:57.332696 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:07:57.332669 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 22 19:08:00.282930 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:00.282892 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" podUID="37164fb0-2462-40ed-8929-4012e1477524" containerName="sequence-graph-23058" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:08:00.283346 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:00.282987 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" Apr 22 19:08:02.203911 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:02.203882 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" Apr 22 19:08:05.283213 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:05.283174 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" podUID="37164fb0-2462-40ed-8929-4012e1477524" containerName="sequence-graph-23058" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:08:07.333021 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:07.332976 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" 
podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 22 19:08:10.283504 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:10.283457 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" podUID="37164fb0-2462-40ed-8929-4012e1477524" containerName="sequence-graph-23058" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:08:12.832540 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.832506 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v"] Apr 22 19:08:12.832897 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.832861 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kube-rbac-proxy" Apr 22 19:08:12.832897 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.832871 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kube-rbac-proxy" Apr 22 19:08:12.832897 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.832892 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kserve-container" Apr 22 19:08:12.832897 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.832898 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kserve-container" Apr 22 19:08:12.833026 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.832958 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kserve-container" Apr 22 19:08:12.833026 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.832968 2577 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="27f5dc48-f149-4b40-a291-41d34033c8d2" containerName="kube-rbac-proxy" Apr 22 19:08:12.836223 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.836206 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" Apr 22 19:08:12.838557 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.838530 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-f6fce-kube-rbac-proxy-sar-config\"" Apr 22 19:08:12.838689 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.838565 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-f6fce-serving-cert\"" Apr 22 19:08:12.843905 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.843884 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v"] Apr 22 19:08:12.904947 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.904911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed591bf-52a9-4670-a732-d500179252ed-proxy-tls\") pod \"ensemble-graph-f6fce-77754d84cf-hl57v\" (UID: \"4ed591bf-52a9-4670-a732-d500179252ed\") " pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" Apr 22 19:08:12.905109 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:12.905050 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ed591bf-52a9-4670-a732-d500179252ed-openshift-service-ca-bundle\") pod \"ensemble-graph-f6fce-77754d84cf-hl57v\" (UID: \"4ed591bf-52a9-4670-a732-d500179252ed\") " pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" Apr 22 19:08:13.006205 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:13.006173 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed591bf-52a9-4670-a732-d500179252ed-proxy-tls\") pod \"ensemble-graph-f6fce-77754d84cf-hl57v\" (UID: \"4ed591bf-52a9-4670-a732-d500179252ed\") " pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" Apr 22 19:08:13.006385 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:13.006238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ed591bf-52a9-4670-a732-d500179252ed-openshift-service-ca-bundle\") pod \"ensemble-graph-f6fce-77754d84cf-hl57v\" (UID: \"4ed591bf-52a9-4670-a732-d500179252ed\") " pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" Apr 22 19:08:13.006385 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:08:13.006324 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-f6fce-serving-cert: secret "ensemble-graph-f6fce-serving-cert" not found Apr 22 19:08:13.006385 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:08:13.006381 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ed591bf-52a9-4670-a732-d500179252ed-proxy-tls podName:4ed591bf-52a9-4670-a732-d500179252ed nodeName:}" failed. No retries permitted until 2026-04-22 19:08:13.506365317 +0000 UTC m=+1281.031028851 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4ed591bf-52a9-4670-a732-d500179252ed-proxy-tls") pod "ensemble-graph-f6fce-77754d84cf-hl57v" (UID: "4ed591bf-52a9-4670-a732-d500179252ed") : secret "ensemble-graph-f6fce-serving-cert" not found Apr 22 19:08:13.006820 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:13.006803 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ed591bf-52a9-4670-a732-d500179252ed-openshift-service-ca-bundle\") pod \"ensemble-graph-f6fce-77754d84cf-hl57v\" (UID: \"4ed591bf-52a9-4670-a732-d500179252ed\") " pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" Apr 22 19:08:13.511095 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:13.511061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed591bf-52a9-4670-a732-d500179252ed-proxy-tls\") pod \"ensemble-graph-f6fce-77754d84cf-hl57v\" (UID: \"4ed591bf-52a9-4670-a732-d500179252ed\") " pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" Apr 22 19:08:13.513420 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:13.513388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed591bf-52a9-4670-a732-d500179252ed-proxy-tls\") pod \"ensemble-graph-f6fce-77754d84cf-hl57v\" (UID: \"4ed591bf-52a9-4670-a732-d500179252ed\") " pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" Apr 22 19:08:13.747969 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:13.747934 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v"
Apr 22 19:08:13.870677 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:13.870617 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v"]
Apr 22 19:08:14.405838 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:14.405798 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" event={"ID":"4ed591bf-52a9-4670-a732-d500179252ed","Type":"ContainerStarted","Data":"b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a"}
Apr 22 19:08:14.405838 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:14.405843 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" event={"ID":"4ed591bf-52a9-4670-a732-d500179252ed","Type":"ContainerStarted","Data":"9f1288c1aeeee116ad0266d13d15f019333ba97ebc59ecec0c448c1e72855072"}
Apr 22 19:08:14.406049 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:14.405908 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v"
Apr 22 19:08:14.421546 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:14.421501 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" podStartSLOduration=2.421488679 podStartE2EDuration="2.421488679s" podCreationTimestamp="2026-04-22 19:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:08:14.419836806 +0000 UTC m=+1281.944500350" watchObservedRunningTime="2026-04-22 19:08:14.421488679 +0000 UTC m=+1281.946152230"
Apr 22 19:08:15.283141 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:15.283102 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" podUID="37164fb0-2462-40ed-8929-4012e1477524" containerName="sequence-graph-23058" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:08:17.333735 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:17.333699 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 22 19:08:18.553308 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:08:18.553258 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37164fb0_2462_40ed_8929_4012e1477524.slice/crio-7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 19:08:18.553686 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:08:18.553348 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37164fb0_2462_40ed_8929_4012e1477524.slice/crio-7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37164fb0_2462_40ed_8929_4012e1477524.slice/crio-conmon-7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 19:08:18.553686 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:08:18.553364 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37164fb0_2462_40ed_8929_4012e1477524.slice/crio-ed429738cdea2040252a13d12549317c7e9d50302dc1d50ba4fd4420a1a8c331\": RecentStats: unable to find data in memory cache]"
Apr 22 19:08:18.553686 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:08:18.553425 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37164fb0_2462_40ed_8929_4012e1477524.slice/crio-ed429738cdea2040252a13d12549317c7e9d50302dc1d50ba4fd4420a1a8c331\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37164fb0_2462_40ed_8929_4012e1477524.slice/crio-7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 19:08:18.697832 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:18.697807 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz"
Apr 22 19:08:18.757088 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:18.757058 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37164fb0-2462-40ed-8929-4012e1477524-openshift-service-ca-bundle\") pod \"37164fb0-2462-40ed-8929-4012e1477524\" (UID: \"37164fb0-2462-40ed-8929-4012e1477524\") "
Apr 22 19:08:18.757247 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:18.757104 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37164fb0-2462-40ed-8929-4012e1477524-proxy-tls\") pod \"37164fb0-2462-40ed-8929-4012e1477524\" (UID: \"37164fb0-2462-40ed-8929-4012e1477524\") "
Apr 22 19:08:18.757521 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:18.757488 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37164fb0-2462-40ed-8929-4012e1477524-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "37164fb0-2462-40ed-8929-4012e1477524" (UID: "37164fb0-2462-40ed-8929-4012e1477524"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:08:18.759177 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:18.759157 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37164fb0-2462-40ed-8929-4012e1477524-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "37164fb0-2462-40ed-8929-4012e1477524" (UID: "37164fb0-2462-40ed-8929-4012e1477524"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:08:18.858037 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:18.857947 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37164fb0-2462-40ed-8929-4012e1477524-openshift-service-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 19:08:18.858037 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:18.857983 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37164fb0-2462-40ed-8929-4012e1477524-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 19:08:19.423216 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:19.423179 2577 generic.go:358] "Generic (PLEG): container finished" podID="37164fb0-2462-40ed-8929-4012e1477524" containerID="7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820" exitCode=137
Apr 22 19:08:19.423418 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:19.423222 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" event={"ID":"37164fb0-2462-40ed-8929-4012e1477524","Type":"ContainerDied","Data":"7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820"}
Apr 22 19:08:19.423418 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:19.423244 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz" event={"ID":"37164fb0-2462-40ed-8929-4012e1477524","Type":"ContainerDied","Data":"ed429738cdea2040252a13d12549317c7e9d50302dc1d50ba4fd4420a1a8c331"}
Apr 22 19:08:19.423418 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:19.423244 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz"
Apr 22 19:08:19.423418 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:19.423256 2577 scope.go:117] "RemoveContainer" containerID="7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820"
Apr 22 19:08:19.431093 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:19.431079 2577 scope.go:117] "RemoveContainer" containerID="7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820"
Apr 22 19:08:19.431358 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:08:19.431341 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820\": container with ID starting with 7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820 not found: ID does not exist" containerID="7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820"
Apr 22 19:08:19.431438 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:19.431366 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820"} err="failed to get container status \"7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820\": rpc error: code = NotFound desc = could not find container \"7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820\": container with ID starting with 7ac47798024ebecbaa28dd1b5257b0d86b56f52a2c3aeb0b8a2cc079e5cdb820 not found: ID does not exist"
Apr 22 19:08:19.438038 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:19.438012 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz"]
Apr 22 19:08:19.441818 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:19.441793 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-23058-65fbffcfb8-hmthz"]
Apr 22 19:08:20.414900 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:20.414875 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v"
Apr 22 19:08:20.990679 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:20.990644 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37164fb0-2462-40ed-8929-4012e1477524" path="/var/lib/kubelet/pods/37164fb0-2462-40ed-8929-4012e1477524/volumes"
Apr 22 19:08:27.332722 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:27.332680 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 22 19:08:37.333976 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:37.333944 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6"
Apr 22 19:08:48.785387 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.785353 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"]
Apr 22 19:08:48.785981 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.785938 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37164fb0-2462-40ed-8929-4012e1477524" containerName="sequence-graph-23058"
Apr 22 19:08:48.785981 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.785961 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="37164fb0-2462-40ed-8929-4012e1477524" containerName="sequence-graph-23058"
Apr 22 19:08:48.786107 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.786070 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="37164fb0-2462-40ed-8929-4012e1477524" containerName="sequence-graph-23058"
Apr 22 19:08:48.789081 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.789059 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"
Apr 22 19:08:48.791485 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.791461 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-ffb29-serving-cert\""
Apr 22 19:08:48.791591 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.791487 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-ffb29-kube-rbac-proxy-sar-config\""
Apr 22 19:08:48.797661 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.797640 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"]
Apr 22 19:08:48.826617 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.826577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-proxy-tls\") pod \"sequence-graph-ffb29-bb6b7bfb7-9tcpf\" (UID: \"7d4f4c46-a02f-4055-864c-69ae1b4a0f05\") " pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"
Apr 22 19:08:48.826756 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.826635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-openshift-service-ca-bundle\") pod \"sequence-graph-ffb29-bb6b7bfb7-9tcpf\" (UID: \"7d4f4c46-a02f-4055-864c-69ae1b4a0f05\") " pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"
Apr 22 19:08:48.927692 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.927654 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-proxy-tls\") pod \"sequence-graph-ffb29-bb6b7bfb7-9tcpf\" (UID: \"7d4f4c46-a02f-4055-864c-69ae1b4a0f05\") " pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"
Apr 22 19:08:48.927692 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.927692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-openshift-service-ca-bundle\") pod \"sequence-graph-ffb29-bb6b7bfb7-9tcpf\" (UID: \"7d4f4c46-a02f-4055-864c-69ae1b4a0f05\") " pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"
Apr 22 19:08:48.927905 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:08:48.927806 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-ffb29-serving-cert: secret "sequence-graph-ffb29-serving-cert" not found
Apr 22 19:08:48.927905 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:08:48.927884 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-proxy-tls podName:7d4f4c46-a02f-4055-864c-69ae1b4a0f05 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:49.427865594 +0000 UTC m=+1316.952529124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-proxy-tls") pod "sequence-graph-ffb29-bb6b7bfb7-9tcpf" (UID: "7d4f4c46-a02f-4055-864c-69ae1b4a0f05") : secret "sequence-graph-ffb29-serving-cert" not found
Apr 22 19:08:48.928347 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:48.928330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-openshift-service-ca-bundle\") pod \"sequence-graph-ffb29-bb6b7bfb7-9tcpf\" (UID: \"7d4f4c46-a02f-4055-864c-69ae1b4a0f05\") " pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"
Apr 22 19:08:49.431207 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:49.431157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-proxy-tls\") pod \"sequence-graph-ffb29-bb6b7bfb7-9tcpf\" (UID: \"7d4f4c46-a02f-4055-864c-69ae1b4a0f05\") " pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"
Apr 22 19:08:49.433585 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:49.433564 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-proxy-tls\") pod \"sequence-graph-ffb29-bb6b7bfb7-9tcpf\" (UID: \"7d4f4c46-a02f-4055-864c-69ae1b4a0f05\") " pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"
Apr 22 19:08:49.700095 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:49.700010 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"
Apr 22 19:08:50.029580 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:50.029527 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"]
Apr 22 19:08:50.032092 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:08:50.032061 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d4f4c46_a02f_4055_864c_69ae1b4a0f05.slice/crio-0333f085797853acd63149caf86337807aec367276c6d53470bbc253d7946aa4 WatchSource:0}: Error finding container 0333f085797853acd63149caf86337807aec367276c6d53470bbc253d7946aa4: Status 404 returned error can't find the container with id 0333f085797853acd63149caf86337807aec367276c6d53470bbc253d7946aa4
Apr 22 19:08:50.527129 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:50.527088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" event={"ID":"7d4f4c46-a02f-4055-864c-69ae1b4a0f05","Type":"ContainerStarted","Data":"d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea"}
Apr 22 19:08:50.527129 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:50.527127 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" event={"ID":"7d4f4c46-a02f-4055-864c-69ae1b4a0f05","Type":"ContainerStarted","Data":"0333f085797853acd63149caf86337807aec367276c6d53470bbc253d7946aa4"}
Apr 22 19:08:50.527353 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:50.527152 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"
Apr 22 19:08:50.543663 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:50.543609 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" podStartSLOduration=2.543592523 podStartE2EDuration="2.543592523s" podCreationTimestamp="2026-04-22 19:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:08:50.542550612 +0000 UTC m=+1318.067214176" watchObservedRunningTime="2026-04-22 19:08:50.543592523 +0000 UTC m=+1318.068256077"
Apr 22 19:08:56.536845 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:08:56.536816 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"
Apr 22 19:11:52.970778 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:11:52.970750 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log"
Apr 22 19:11:52.976704 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:11:52.976685 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log"
Apr 22 19:16:27.395358 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.395323 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v"]
Apr 22 19:16:27.397845 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.395549 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" podUID="4ed591bf-52a9-4670-a732-d500179252ed" containerName="ensemble-graph-f6fce" containerID="cri-o://b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a" gracePeriod=30
Apr 22 19:16:27.556402 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.556362 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2"]
Apr 22 19:16:27.556767 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.556739 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kserve-container" containerID="cri-o://4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4" gracePeriod=30
Apr 22 19:16:27.556978 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.556799 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kube-rbac-proxy" containerID="cri-o://928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92" gracePeriod=30
Apr 22 19:16:27.631082 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.631047 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"]
Apr 22 19:16:27.634780 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.634759 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:27.637220 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.637199 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c8908-predictor-serving-cert\""
Apr 22 19:16:27.637307 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.637211 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c8908-kube-rbac-proxy-sar-config\""
Apr 22 19:16:27.651228 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.647237 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"]
Apr 22 19:16:27.828718 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.828680 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhzzp\" (UniqueName: \"kubernetes.io/projected/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-kube-api-access-mhzzp\") pod \"error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:27.828907 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.828810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-proxy-tls\") pod \"error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:27.828907 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.828877 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-c8908-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-error-404-isvc-c8908-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:27.930221 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.930125 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-proxy-tls\") pod \"error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:27.930221 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.930183 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-c8908-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-error-404-isvc-c8908-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:27.930458 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.930262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhzzp\" (UniqueName: \"kubernetes.io/projected/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-kube-api-access-mhzzp\") pod \"error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:27.930953 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.930919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-c8908-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-error-404-isvc-c8908-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:27.932607 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.932585 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-proxy-tls\") pod \"error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:27.938044 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.938026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhzzp\" (UniqueName: \"kubernetes.io/projected/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-kube-api-access-mhzzp\") pod \"error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:27.952917 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:27.952888 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:28.076044 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:28.076020 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"]
Apr 22 19:16:28.077848 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:16:28.077819 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42e7e6aa_3ad6_42ee_a761_ddf4d8bf0141.slice/crio-c62c5a7c05f2d08bfac010ec9ec54365dca634f67abd08b8887cefbc089a7f91 WatchSource:0}: Error finding container c62c5a7c05f2d08bfac010ec9ec54365dca634f67abd08b8887cefbc089a7f91: Status 404 returned error can't find the container with id c62c5a7c05f2d08bfac010ec9ec54365dca634f67abd08b8887cefbc089a7f91
Apr 22 19:16:28.082563 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:28.082544 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:16:28.086012 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:28.085988 2577 generic.go:358] "Generic (PLEG): container finished" podID="e2897854-0854-4157-8b02-eeaae615da73" containerID="928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92" exitCode=2
Apr 22 19:16:28.086125 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:28.086073 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" event={"ID":"e2897854-0854-4157-8b02-eeaae615da73","Type":"ContainerDied","Data":"928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92"}
Apr 22 19:16:28.087192 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:28.087170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" event={"ID":"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141","Type":"ContainerStarted","Data":"c62c5a7c05f2d08bfac010ec9ec54365dca634f67abd08b8887cefbc089a7f91"}
Apr 22 19:16:29.091752 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:29.091720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" event={"ID":"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141","Type":"ContainerStarted","Data":"3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466"}
Apr 22 19:16:29.091752 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:29.091756 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" event={"ID":"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141","Type":"ContainerStarted","Data":"e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197"}
Apr 22 19:16:29.092448 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:29.091891 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:29.111915 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:29.111860 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" podStartSLOduration=2.111846083 podStartE2EDuration="2.111846083s" podCreationTimestamp="2026-04-22 19:16:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:16:29.110950488 +0000 UTC m=+1776.635614037" watchObservedRunningTime="2026-04-22 19:16:29.111846083 +0000 UTC m=+1776.636509634"
Apr 22 19:16:30.095668 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:30.095629 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"
Apr 22 19:16:30.096902 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:30.096871 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 22 19:16:30.413336 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:30.413228 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" podUID="4ed591bf-52a9-4670-a732-d500179252ed" containerName="ensemble-graph-f6fce" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:16:30.908775 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:30.908751 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2"
Apr 22 19:16:30.952374 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:30.952291 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls\") pod \"e2897854-0854-4157-8b02-eeaae615da73\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") "
Apr 22 19:16:30.952524 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:30.952410 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-f6fce-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e2897854-0854-4157-8b02-eeaae615da73-error-404-isvc-f6fce-kube-rbac-proxy-sar-config\") pod \"e2897854-0854-4157-8b02-eeaae615da73\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") "
Apr 22 19:16:30.952524 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:30.952441 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xngg2\" (UniqueName: \"kubernetes.io/projected/e2897854-0854-4157-8b02-eeaae615da73-kube-api-access-xngg2\") pod \"e2897854-0854-4157-8b02-eeaae615da73\" (UID: \"e2897854-0854-4157-8b02-eeaae615da73\") "
Apr 22 19:16:30.952881 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:30.952850 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2897854-0854-4157-8b02-eeaae615da73-error-404-isvc-f6fce-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-f6fce-kube-rbac-proxy-sar-config") pod "e2897854-0854-4157-8b02-eeaae615da73" (UID: "e2897854-0854-4157-8b02-eeaae615da73"). InnerVolumeSpecName "error-404-isvc-f6fce-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:16:30.954585 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:30.954555 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e2897854-0854-4157-8b02-eeaae615da73" (UID: "e2897854-0854-4157-8b02-eeaae615da73"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:16:30.954813 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:30.954792 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2897854-0854-4157-8b02-eeaae615da73-kube-api-access-xngg2" (OuterVolumeSpecName: "kube-api-access-xngg2") pod "e2897854-0854-4157-8b02-eeaae615da73" (UID: "e2897854-0854-4157-8b02-eeaae615da73"). InnerVolumeSpecName "kube-api-access-xngg2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:16:31.053581 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.053539 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2897854-0854-4157-8b02-eeaae615da73-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 19:16:31.053581 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.053575 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-f6fce-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e2897854-0854-4157-8b02-eeaae615da73-error-404-isvc-f6fce-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 19:16:31.053581 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.053587 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xngg2\" (UniqueName: \"kubernetes.io/projected/e2897854-0854-4157-8b02-eeaae615da73-kube-api-access-xngg2\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\""
Apr 22 19:16:31.100860 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.100823 2577 generic.go:358] "Generic (PLEG): container finished" podID="e2897854-0854-4157-8b02-eeaae615da73" containerID="4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4" exitCode=0
Apr 22 19:16:31.101316 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.100901 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2"
Apr 22 19:16:31.101316 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.100916 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" event={"ID":"e2897854-0854-4157-8b02-eeaae615da73","Type":"ContainerDied","Data":"4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4"}
Apr 22 19:16:31.101316 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.100954 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2" event={"ID":"e2897854-0854-4157-8b02-eeaae615da73","Type":"ContainerDied","Data":"081b688f7ed48fd6217fdc4b8459e4f95c0a0d48cc15856e1616ba1f6c5a3d87"}
Apr 22 19:16:31.101316 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.100969 2577 scope.go:117] "RemoveContainer" containerID="928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92"
Apr 22 19:16:31.101705 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.101677 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 22 19:16:31.109187 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.109167 2577 scope.go:117] "RemoveContainer" containerID="4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4"
Apr 22 19:16:31.117026 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.117003 2577 scope.go:117] "RemoveContainer" containerID="928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92"
Apr 22 19:16:31.117377 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:16:31.117347 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92\": container with ID starting with 928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92 not found: ID does not exist" containerID="928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92"
Apr 22 19:16:31.117488 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.117398 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92"} err="failed to get container status \"928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92\": rpc error: code = NotFound desc = could not find container \"928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92\": container with ID starting with 928f010e70aec4677a1175794158c2373bc2f0c62438603104de28e20d0aac92 not found: ID does not exist"
Apr 22 19:16:31.117488 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.117425 2577 scope.go:117] "RemoveContainer" containerID="4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4"
Apr 22 19:16:31.117865 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:16:31.117829 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4\": container with ID starting with 4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4 not found: ID does not exist" containerID="4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4"
Apr 22 19:16:31.118003 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.117887 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4"} err="failed to get container status \"4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4\": rpc error: code = NotFound desc = could not find container
\"4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4\": container with ID starting with 4ed093b526e2b1091c21d5403667b41237e2502a3711b2a233357746cfe061c4 not found: ID does not exist" Apr 22 19:16:31.120981 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.120960 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2"] Apr 22 19:16:31.122741 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:31.122720 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6fce-predictor-98bc8cf4f-nd4v2"] Apr 22 19:16:32.990427 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:32.990382 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2897854-0854-4157-8b02-eeaae615da73" path="/var/lib/kubelet/pods/e2897854-0854-4157-8b02-eeaae615da73/volumes" Apr 22 19:16:35.413105 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:35.413067 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" podUID="4ed591bf-52a9-4670-a732-d500179252ed" containerName="ensemble-graph-f6fce" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:16:36.106889 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:36.106863 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" Apr 22 19:16:36.107466 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:36.107440 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 22 19:16:40.413278 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:40.413175 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" podUID="4ed591bf-52a9-4670-a732-d500179252ed" containerName="ensemble-graph-f6fce" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:16:40.413645 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:40.413338 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" Apr 22 19:16:45.413025 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:45.412987 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" podUID="4ed591bf-52a9-4670-a732-d500179252ed" containerName="ensemble-graph-f6fce" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:16:46.107896 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:46.107849 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 22 19:16:50.414068 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:50.413977 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" podUID="4ed591bf-52a9-4670-a732-d500179252ed" containerName="ensemble-graph-f6fce" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:16:53.006805 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:53.006773 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 19:16:53.010174 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:53.010151 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 19:16:55.413672 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:55.413620 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" podUID="4ed591bf-52a9-4670-a732-d500179252ed" containerName="ensemble-graph-f6fce" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:16:56.107717 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:56.107677 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 22 19:16:57.545806 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:57.545510 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" Apr 22 19:16:57.679137 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:57.679051 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed591bf-52a9-4670-a732-d500179252ed-proxy-tls\") pod \"4ed591bf-52a9-4670-a732-d500179252ed\" (UID: \"4ed591bf-52a9-4670-a732-d500179252ed\") " Apr 22 19:16:57.679137 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:57.679100 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ed591bf-52a9-4670-a732-d500179252ed-openshift-service-ca-bundle\") pod \"4ed591bf-52a9-4670-a732-d500179252ed\" (UID: \"4ed591bf-52a9-4670-a732-d500179252ed\") " Apr 22 19:16:57.679519 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:57.679494 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ed591bf-52a9-4670-a732-d500179252ed-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "4ed591bf-52a9-4670-a732-d500179252ed" (UID: "4ed591bf-52a9-4670-a732-d500179252ed"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:16:57.681131 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:57.681105 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed591bf-52a9-4670-a732-d500179252ed-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4ed591bf-52a9-4670-a732-d500179252ed" (UID: "4ed591bf-52a9-4670-a732-d500179252ed"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:16:57.780597 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:57.780561 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed591bf-52a9-4670-a732-d500179252ed-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:16:57.780597 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:57.780596 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ed591bf-52a9-4670-a732-d500179252ed-openshift-service-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:16:58.199855 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:58.199817 2577 generic.go:358] "Generic (PLEG): container finished" podID="4ed591bf-52a9-4670-a732-d500179252ed" containerID="b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a" exitCode=0 Apr 22 19:16:58.200041 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:58.199879 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" Apr 22 19:16:58.200041 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:58.199879 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" event={"ID":"4ed591bf-52a9-4670-a732-d500179252ed","Type":"ContainerDied","Data":"b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a"} Apr 22 19:16:58.200041 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:58.199977 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v" event={"ID":"4ed591bf-52a9-4670-a732-d500179252ed","Type":"ContainerDied","Data":"9f1288c1aeeee116ad0266d13d15f019333ba97ebc59ecec0c448c1e72855072"} Apr 22 19:16:58.200041 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:58.199992 2577 scope.go:117] "RemoveContainer" containerID="b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a" Apr 22 19:16:58.208522 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:58.208500 2577 scope.go:117] "RemoveContainer" containerID="b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a" Apr 22 19:16:58.208784 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:16:58.208763 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a\": container with ID starting with b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a not found: ID does not exist" containerID="b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a" Apr 22 19:16:58.208867 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:58.208791 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a"} err="failed to get container status 
\"b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a\": rpc error: code = NotFound desc = could not find container \"b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a\": container with ID starting with b963b20f7d26b66d49a581f4da8447b04ecc043bb38034138afb7d863b71329a not found: ID does not exist" Apr 22 19:16:58.221399 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:58.221370 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v"] Apr 22 19:16:58.225433 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:58.225407 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6fce-77754d84cf-hl57v"] Apr 22 19:16:58.991417 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:16:58.991385 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed591bf-52a9-4670-a732-d500179252ed" path="/var/lib/kubelet/pods/4ed591bf-52a9-4670-a732-d500179252ed/volumes" Apr 22 19:17:03.321162 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.321120 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"] Apr 22 19:17:03.321665 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.321432 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" podUID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" containerName="sequence-graph-ffb29" containerID="cri-o://d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea" gracePeriod=30 Apr 22 19:17:03.488440 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.488408 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6"] Apr 22 19:17:03.488753 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.488723 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kserve-container" containerID="cri-o://23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08" gracePeriod=30 Apr 22 19:17:03.488887 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.488769 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kube-rbac-proxy" containerID="cri-o://4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d" gracePeriod=30 Apr 22 19:17:03.534543 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.534516 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl"] Apr 22 19:17:03.534882 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.534870 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ed591bf-52a9-4670-a732-d500179252ed" containerName="ensemble-graph-f6fce" Apr 22 19:17:03.534962 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.534883 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed591bf-52a9-4670-a732-d500179252ed" containerName="ensemble-graph-f6fce" Apr 22 19:17:03.534962 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.534900 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kserve-container" Apr 22 19:17:03.534962 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.534906 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kserve-container" Apr 22 19:17:03.534962 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.534913 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2897854-0854-4157-8b02-eeaae615da73" 
containerName="kube-rbac-proxy" Apr 22 19:17:03.534962 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.534919 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kube-rbac-proxy" Apr 22 19:17:03.535200 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.534989 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ed591bf-52a9-4670-a732-d500179252ed" containerName="ensemble-graph-f6fce" Apr 22 19:17:03.535200 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.534998 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kube-rbac-proxy" Apr 22 19:17:03.535200 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.535006 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2897854-0854-4157-8b02-eeaae615da73" containerName="kserve-container" Apr 22 19:17:03.539659 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.539643 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:03.542243 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.542224 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e7803-predictor-serving-cert\"" Apr 22 19:17:03.542366 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.542320 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e7803-kube-rbac-proxy-sar-config\"" Apr 22 19:17:03.547302 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.547262 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl"] Apr 22 19:17:03.634705 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.634662 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-e7803-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-error-404-isvc-e7803-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:03.634882 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.634715 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-proxy-tls\") pod \"error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:03.634882 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.634825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9nxg\" (UniqueName: 
\"kubernetes.io/projected/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-kube-api-access-q9nxg\") pod \"error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:03.735332 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.735289 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9nxg\" (UniqueName: \"kubernetes.io/projected/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-kube-api-access-q9nxg\") pod \"error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:03.735506 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.735393 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-e7803-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-error-404-isvc-e7803-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:03.735506 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.735435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-proxy-tls\") pod \"error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:03.735618 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:17:03.735574 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-e7803-predictor-serving-cert: secret "error-404-isvc-e7803-predictor-serving-cert" not found Apr 22 19:17:03.735673 ip-10-0-143-56 
kubenswrapper[2577]: E0422 19:17:03.735659 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-proxy-tls podName:55f046b3-38f6-4451-b8a1-cf7b0bdfc432 nodeName:}" failed. No retries permitted until 2026-04-22 19:17:04.235639826 +0000 UTC m=+1811.760303370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-proxy-tls") pod "error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" (UID: "55f046b3-38f6-4451-b8a1-cf7b0bdfc432") : secret "error-404-isvc-e7803-predictor-serving-cert" not found Apr 22 19:17:03.736076 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.736048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-e7803-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-error-404-isvc-e7803-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:03.743969 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:03.743940 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9nxg\" (UniqueName: \"kubernetes.io/projected/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-kube-api-access-q9nxg\") pod \"error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:04.222337 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:04.222295 2577 generic.go:358] "Generic (PLEG): container finished" podID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerID="4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d" exitCode=2 Apr 22 19:17:04.222505 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:04.222350 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" event={"ID":"2c07bb2c-1a22-4a32-a847-183c5af71d35","Type":"ContainerDied","Data":"4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d"} Apr 22 19:17:04.240726 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:04.240695 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-proxy-tls\") pod \"error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:04.243076 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:04.243054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-proxy-tls\") pod \"error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:04.450846 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:04.450812 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:04.570678 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:04.570645 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl"] Apr 22 19:17:04.574236 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:17:04.574200 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55f046b3_38f6_4451_b8a1_cf7b0bdfc432.slice/crio-9b22cf778a511912fadbc4247be69db31a1b6290a467f495f08d43ab33bb41c4 WatchSource:0}: Error finding container 9b22cf778a511912fadbc4247be69db31a1b6290a467f495f08d43ab33bb41c4: Status 404 returned error can't find the container with id 9b22cf778a511912fadbc4247be69db31a1b6290a467f495f08d43ab33bb41c4 Apr 22 19:17:05.227886 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:05.227854 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" event={"ID":"55f046b3-38f6-4451-b8a1-cf7b0bdfc432","Type":"ContainerStarted","Data":"d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0"} Apr 22 19:17:05.227886 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:05.227889 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" event={"ID":"55f046b3-38f6-4451-b8a1-cf7b0bdfc432","Type":"ContainerStarted","Data":"08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681"} Apr 22 19:17:05.228117 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:05.227901 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" event={"ID":"55f046b3-38f6-4451-b8a1-cf7b0bdfc432","Type":"ContainerStarted","Data":"9b22cf778a511912fadbc4247be69db31a1b6290a467f495f08d43ab33bb41c4"} Apr 22 19:17:05.228117 ip-10-0-143-56 
kubenswrapper[2577]: I0422 19:17:05.228020 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:05.246053 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:05.246012 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" podStartSLOduration=2.245998759 podStartE2EDuration="2.245998759s" podCreationTimestamp="2026-04-22 19:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:17:05.243683731 +0000 UTC m=+1812.768347316" watchObservedRunningTime="2026-04-22 19:17:05.245998759 +0000 UTC m=+1812.770662309" Apr 22 19:17:06.107745 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.107703 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 22 19:17:06.231201 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.231170 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:06.232660 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.232636 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 19:17:06.535450 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.535418 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" 
podUID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" containerName="sequence-graph-ffb29" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:17:06.748563 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.748533 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:17:06.868472 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.868378 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5w79\" (UniqueName: \"kubernetes.io/projected/2c07bb2c-1a22-4a32-a847-183c5af71d35-kube-api-access-h5w79\") pod \"2c07bb2c-1a22-4a32-a847-183c5af71d35\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " Apr 22 19:17:06.868472 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.868424 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c07bb2c-1a22-4a32-a847-183c5af71d35-proxy-tls\") pod \"2c07bb2c-1a22-4a32-a847-183c5af71d35\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " Apr 22 19:17:06.868715 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.868559 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-ffb29-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2c07bb2c-1a22-4a32-a847-183c5af71d35-error-404-isvc-ffb29-kube-rbac-proxy-sar-config\") pod \"2c07bb2c-1a22-4a32-a847-183c5af71d35\" (UID: \"2c07bb2c-1a22-4a32-a847-183c5af71d35\") " Apr 22 19:17:06.868905 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.868875 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c07bb2c-1a22-4a32-a847-183c5af71d35-error-404-isvc-ffb29-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-ffb29-kube-rbac-proxy-sar-config") pod "2c07bb2c-1a22-4a32-a847-183c5af71d35" (UID: 
"2c07bb2c-1a22-4a32-a847-183c5af71d35"). InnerVolumeSpecName "error-404-isvc-ffb29-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:17:06.870616 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.870591 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c07bb2c-1a22-4a32-a847-183c5af71d35-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2c07bb2c-1a22-4a32-a847-183c5af71d35" (UID: "2c07bb2c-1a22-4a32-a847-183c5af71d35"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:17:06.870727 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.870636 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c07bb2c-1a22-4a32-a847-183c5af71d35-kube-api-access-h5w79" (OuterVolumeSpecName: "kube-api-access-h5w79") pod "2c07bb2c-1a22-4a32-a847-183c5af71d35" (UID: "2c07bb2c-1a22-4a32-a847-183c5af71d35"). InnerVolumeSpecName "kube-api-access-h5w79". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:17:06.970101 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.970064 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h5w79\" (UniqueName: \"kubernetes.io/projected/2c07bb2c-1a22-4a32-a847-183c5af71d35-kube-api-access-h5w79\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:17:06.970101 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.970095 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c07bb2c-1a22-4a32-a847-183c5af71d35-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:17:06.970101 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:06.970108 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-ffb29-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2c07bb2c-1a22-4a32-a847-183c5af71d35-error-404-isvc-ffb29-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:17:07.235458 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.235426 2577 generic.go:358] "Generic (PLEG): container finished" podID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerID="23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08" exitCode=0 Apr 22 19:17:07.235894 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.235509 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" Apr 22 19:17:07.235894 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.235512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" event={"ID":"2c07bb2c-1a22-4a32-a847-183c5af71d35","Type":"ContainerDied","Data":"23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08"} Apr 22 19:17:07.235894 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.235560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6" event={"ID":"2c07bb2c-1a22-4a32-a847-183c5af71d35","Type":"ContainerDied","Data":"26e3b949a5eff8c116a6ed5a6377fe02c1bea9c8959ee5be6c3c5dac4a62ca1e"} Apr 22 19:17:07.235894 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.235584 2577 scope.go:117] "RemoveContainer" containerID="4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d" Apr 22 19:17:07.236137 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.235997 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 19:17:07.244380 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.244361 2577 scope.go:117] "RemoveContainer" containerID="23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08" Apr 22 19:17:07.252319 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.252297 2577 scope.go:117] "RemoveContainer" containerID="4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d" Apr 22 19:17:07.252577 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:17:07.252552 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d\": container with ID starting with 4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d not found: ID does not exist" containerID="4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d" Apr 22 19:17:07.252694 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.252587 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d"} err="failed to get container status \"4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d\": rpc error: code = NotFound desc = could not find container \"4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d\": container with ID starting with 4ad242a3a99ad797f1f0115e6358b9e56490890b42df2a1eca4a2a3fdd92172d not found: ID does not exist" Apr 22 19:17:07.252694 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.252607 2577 scope.go:117] "RemoveContainer" containerID="23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08" Apr 22 19:17:07.252861 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:17:07.252843 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08\": container with ID starting with 23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08 not found: ID does not exist" containerID="23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08" Apr 22 19:17:07.252902 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.252867 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08"} err="failed to get container status \"23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08\": rpc error: code = NotFound desc = could not find container 
\"23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08\": container with ID starting with 23eb7578267338d21daccea438a72b8c96ea7670b274bb1cd33d2b66d08c6c08 not found: ID does not exist" Apr 22 19:17:07.253184 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.253167 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6"] Apr 22 19:17:07.257361 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:07.257340 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ffb29-predictor-564df6b8f9-fssm6"] Apr 22 19:17:08.991123 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:08.991088 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" path="/var/lib/kubelet/pods/2c07bb2c-1a22-4a32-a847-183c5af71d35/volumes" Apr 22 19:17:11.534846 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:11.534810 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" podUID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" containerName="sequence-graph-ffb29" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:17:12.240694 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:12.240662 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:12.241188 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:12.241153 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 19:17:16.109098 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:16.109064 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" Apr 22 19:17:16.534462 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:16.534423 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" podUID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" containerName="sequence-graph-ffb29" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:17:16.534635 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:16.534556 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" Apr 22 19:17:21.535307 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:21.535247 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" podUID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" containerName="sequence-graph-ffb29" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:17:22.241299 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:22.241238 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 19:17:26.534599 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:26.534555 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" podUID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" containerName="sequence-graph-ffb29" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:17:31.535497 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:31.535457 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" podUID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" 
containerName="sequence-graph-ffb29" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:17:32.241654 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:32.241600 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 19:17:33.471574 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:33.471551 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" Apr 22 19:17:33.501620 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:33.501576 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-openshift-service-ca-bundle\") pod \"7d4f4c46-a02f-4055-864c-69ae1b4a0f05\" (UID: \"7d4f4c46-a02f-4055-864c-69ae1b4a0f05\") " Apr 22 19:17:33.501781 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:33.501655 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-proxy-tls\") pod \"7d4f4c46-a02f-4055-864c-69ae1b4a0f05\" (UID: \"7d4f4c46-a02f-4055-864c-69ae1b4a0f05\") " Apr 22 19:17:33.501988 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:33.501963 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7d4f4c46-a02f-4055-864c-69ae1b4a0f05" (UID: "7d4f4c46-a02f-4055-864c-69ae1b4a0f05"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:17:33.503699 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:33.503676 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7d4f4c46-a02f-4055-864c-69ae1b4a0f05" (UID: "7d4f4c46-a02f-4055-864c-69ae1b4a0f05"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:17:33.603169 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:33.603079 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-openshift-service-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:17:33.603169 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:33.603124 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d4f4c46-a02f-4055-864c-69ae1b4a0f05-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:17:34.335544 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:34.335509 2577 generic.go:358] "Generic (PLEG): container finished" podID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" containerID="d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea" exitCode=0 Apr 22 19:17:34.335741 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:34.335584 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" Apr 22 19:17:34.335741 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:34.335592 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" event={"ID":"7d4f4c46-a02f-4055-864c-69ae1b4a0f05","Type":"ContainerDied","Data":"d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea"} Apr 22 19:17:34.335741 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:34.335632 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf" event={"ID":"7d4f4c46-a02f-4055-864c-69ae1b4a0f05","Type":"ContainerDied","Data":"0333f085797853acd63149caf86337807aec367276c6d53470bbc253d7946aa4"} Apr 22 19:17:34.335741 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:34.335648 2577 scope.go:117] "RemoveContainer" containerID="d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea" Apr 22 19:17:34.344247 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:34.344230 2577 scope.go:117] "RemoveContainer" containerID="d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea" Apr 22 19:17:34.344526 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:17:34.344510 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea\": container with ID starting with d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea not found: ID does not exist" containerID="d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea" Apr 22 19:17:34.344571 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:34.344534 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea"} err="failed to get container status 
\"d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea\": rpc error: code = NotFound desc = could not find container \"d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea\": container with ID starting with d551ef26fc74501bc2328530fe79e44db1facb725b085a6244df902aac7f5aea not found: ID does not exist" Apr 22 19:17:34.356281 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:34.356208 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"] Apr 22 19:17:34.360524 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:34.360495 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-ffb29-bb6b7bfb7-9tcpf"] Apr 22 19:17:34.991630 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:34.991588 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" path="/var/lib/kubelet/pods/7d4f4c46-a02f-4055-864c-69ae1b4a0f05/volumes" Apr 22 19:17:37.656144 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.656108 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p"] Apr 22 19:17:37.656577 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.656563 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" containerName="sequence-graph-ffb29" Apr 22 19:17:37.656621 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.656579 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" containerName="sequence-graph-ffb29" Apr 22 19:17:37.656621 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.656592 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kserve-container" Apr 22 19:17:37.656621 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.656598 2577 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kserve-container" Apr 22 19:17:37.656621 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.656618 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kube-rbac-proxy" Apr 22 19:17:37.656752 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.656624 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kube-rbac-proxy" Apr 22 19:17:37.656752 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.656687 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kube-rbac-proxy" Apr 22 19:17:37.656752 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.656697 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d4f4c46-a02f-4055-864c-69ae1b4a0f05" containerName="sequence-graph-ffb29" Apr 22 19:17:37.656752 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.656708 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c07bb2c-1a22-4a32-a847-183c5af71d35" containerName="kserve-container" Apr 22 19:17:37.661121 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.661103 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:17:37.663967 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.663708 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-c8908-kube-rbac-proxy-sar-config\"" Apr 22 19:17:37.663967 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.663849 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-c8908-serving-cert\"" Apr 22 19:17:37.666018 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.665991 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p"] Apr 22 19:17:37.741566 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.741533 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b16177f8-1c92-4b97-8749-13a66888dea4-proxy-tls\") pod \"splitter-graph-c8908-6bc4bd4ff9-82b7p\" (UID: \"b16177f8-1c92-4b97-8749-13a66888dea4\") " pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:17:37.741727 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.741621 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b16177f8-1c92-4b97-8749-13a66888dea4-openshift-service-ca-bundle\") pod \"splitter-graph-c8908-6bc4bd4ff9-82b7p\" (UID: \"b16177f8-1c92-4b97-8749-13a66888dea4\") " pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:17:37.842074 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.842035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b16177f8-1c92-4b97-8749-13a66888dea4-proxy-tls\") pod \"splitter-graph-c8908-6bc4bd4ff9-82b7p\" (UID: 
\"b16177f8-1c92-4b97-8749-13a66888dea4\") " pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:17:37.842237 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.842124 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b16177f8-1c92-4b97-8749-13a66888dea4-openshift-service-ca-bundle\") pod \"splitter-graph-c8908-6bc4bd4ff9-82b7p\" (UID: \"b16177f8-1c92-4b97-8749-13a66888dea4\") " pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:17:37.842237 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:17:37.842198 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-c8908-serving-cert: secret "splitter-graph-c8908-serving-cert" not found Apr 22 19:17:37.842336 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:17:37.842295 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b16177f8-1c92-4b97-8749-13a66888dea4-proxy-tls podName:b16177f8-1c92-4b97-8749-13a66888dea4 nodeName:}" failed. No retries permitted until 2026-04-22 19:17:38.342250844 +0000 UTC m=+1845.866914387 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b16177f8-1c92-4b97-8749-13a66888dea4-proxy-tls") pod "splitter-graph-c8908-6bc4bd4ff9-82b7p" (UID: "b16177f8-1c92-4b97-8749-13a66888dea4") : secret "splitter-graph-c8908-serving-cert" not found Apr 22 19:17:37.842778 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:37.842757 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b16177f8-1c92-4b97-8749-13a66888dea4-openshift-service-ca-bundle\") pod \"splitter-graph-c8908-6bc4bd4ff9-82b7p\" (UID: \"b16177f8-1c92-4b97-8749-13a66888dea4\") " pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:17:38.346971 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:38.346936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b16177f8-1c92-4b97-8749-13a66888dea4-proxy-tls\") pod \"splitter-graph-c8908-6bc4bd4ff9-82b7p\" (UID: \"b16177f8-1c92-4b97-8749-13a66888dea4\") " pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:17:38.349517 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:38.349489 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b16177f8-1c92-4b97-8749-13a66888dea4-proxy-tls\") pod \"splitter-graph-c8908-6bc4bd4ff9-82b7p\" (UID: \"b16177f8-1c92-4b97-8749-13a66888dea4\") " pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:17:38.573351 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:38.573315 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:17:38.698578 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:38.698551 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p"] Apr 22 19:17:38.701007 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:17:38.700969 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16177f8_1c92_4b97_8749_13a66888dea4.slice/crio-5cd45f54d5b60e36c6d970a020392c1748a0cb0e8407d7175b1e15edd823e388 WatchSource:0}: Error finding container 5cd45f54d5b60e36c6d970a020392c1748a0cb0e8407d7175b1e15edd823e388: Status 404 returned error can't find the container with id 5cd45f54d5b60e36c6d970a020392c1748a0cb0e8407d7175b1e15edd823e388 Apr 22 19:17:39.356397 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:39.356358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" event={"ID":"b16177f8-1c92-4b97-8749-13a66888dea4","Type":"ContainerStarted","Data":"a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607"} Apr 22 19:17:39.356397 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:39.356398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" event={"ID":"b16177f8-1c92-4b97-8749-13a66888dea4","Type":"ContainerStarted","Data":"5cd45f54d5b60e36c6d970a020392c1748a0cb0e8407d7175b1e15edd823e388"} Apr 22 19:17:39.356655 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:39.356508 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:17:39.373333 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:39.373252 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" 
podStartSLOduration=2.373238922 podStartE2EDuration="2.373238922s" podCreationTimestamp="2026-04-22 19:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:17:39.371110222 +0000 UTC m=+1846.895773784" watchObservedRunningTime="2026-04-22 19:17:39.373238922 +0000 UTC m=+1846.897902470" Apr 22 19:17:42.241560 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:42.241523 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 19:17:45.366519 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:45.366488 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:17:47.715396 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:47.715363 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p"] Apr 22 19:17:47.715787 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:47.715569 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" podUID="b16177f8-1c92-4b97-8749-13a66888dea4" containerName="splitter-graph-c8908" containerID="cri-o://a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607" gracePeriod=30 Apr 22 19:17:47.911427 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:47.911394 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"] Apr 22 19:17:47.911731 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:47.911703 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kserve-container" containerID="cri-o://e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197" gracePeriod=30 Apr 22 19:17:47.911799 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:47.911722 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kube-rbac-proxy" containerID="cri-o://3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466" gracePeriod=30 Apr 22 19:17:47.946627 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:47.946599 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv"] Apr 22 19:17:47.951379 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:47.951358 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:47.953696 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:47.953672 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-352f4-predictor-serving-cert\"" Apr 22 19:17:47.953812 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:47.953785 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-352f4-kube-rbac-proxy-sar-config\"" Apr 22 19:17:47.958099 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:47.958067 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv"] Apr 22 19:17:48.032565 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.032534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"error-404-isvc-352f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8317a774-5ed9-452a-8edc-6a717007cb36-error-404-isvc-352f4-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-352f4-predictor-6f5f47949-9nfnv\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:48.032693 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.032637 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8317a774-5ed9-452a-8edc-6a717007cb36-proxy-tls\") pod \"error-404-isvc-352f4-predictor-6f5f47949-9nfnv\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:48.032693 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.032689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrc7n\" (UniqueName: \"kubernetes.io/projected/8317a774-5ed9-452a-8edc-6a717007cb36-kube-api-access-jrc7n\") pod \"error-404-isvc-352f4-predictor-6f5f47949-9nfnv\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:48.133193 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.133151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8317a774-5ed9-452a-8edc-6a717007cb36-proxy-tls\") pod \"error-404-isvc-352f4-predictor-6f5f47949-9nfnv\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:48.133403 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.133244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrc7n\" (UniqueName: 
\"kubernetes.io/projected/8317a774-5ed9-452a-8edc-6a717007cb36-kube-api-access-jrc7n\") pod \"error-404-isvc-352f4-predictor-6f5f47949-9nfnv\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:48.133551 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.133521 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-352f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8317a774-5ed9-452a-8edc-6a717007cb36-error-404-isvc-352f4-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-352f4-predictor-6f5f47949-9nfnv\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:48.134138 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.134111 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-352f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8317a774-5ed9-452a-8edc-6a717007cb36-error-404-isvc-352f4-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-352f4-predictor-6f5f47949-9nfnv\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:48.135560 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.135538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8317a774-5ed9-452a-8edc-6a717007cb36-proxy-tls\") pod \"error-404-isvc-352f4-predictor-6f5f47949-9nfnv\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:48.141434 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.141408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrc7n\" (UniqueName: 
\"kubernetes.io/projected/8317a774-5ed9-452a-8edc-6a717007cb36-kube-api-access-jrc7n\") pod \"error-404-isvc-352f4-predictor-6f5f47949-9nfnv\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:48.262876 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.262839 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:48.383344 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.383319 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv"] Apr 22 19:17:48.385074 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:17:48.385042 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8317a774_5ed9_452a_8edc_6a717007cb36.slice/crio-0597da79fadff8d28eb82124d9596d654594acd81137f949f43cd27efd41ff5d WatchSource:0}: Error finding container 0597da79fadff8d28eb82124d9596d654594acd81137f949f43cd27efd41ff5d: Status 404 returned error can't find the container with id 0597da79fadff8d28eb82124d9596d654594acd81137f949f43cd27efd41ff5d Apr 22 19:17:48.389449 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.389426 2577 generic.go:358] "Generic (PLEG): container finished" podID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerID="3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466" exitCode=2 Apr 22 19:17:48.389546 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:48.389500 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" event={"ID":"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141","Type":"ContainerDied","Data":"3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466"} Apr 22 19:17:49.404306 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:49.404253 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" event={"ID":"8317a774-5ed9-452a-8edc-6a717007cb36","Type":"ContainerStarted","Data":"317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0"} Apr 22 19:17:49.404306 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:49.404309 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" event={"ID":"8317a774-5ed9-452a-8edc-6a717007cb36","Type":"ContainerStarted","Data":"cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d"} Apr 22 19:17:49.404813 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:49.404322 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" event={"ID":"8317a774-5ed9-452a-8edc-6a717007cb36","Type":"ContainerStarted","Data":"0597da79fadff8d28eb82124d9596d654594acd81137f949f43cd27efd41ff5d"} Apr 22 19:17:49.404813 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:49.404469 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:49.421939 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:49.421886 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" podStartSLOduration=2.421871268 podStartE2EDuration="2.421871268s" podCreationTimestamp="2026-04-22 19:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:17:49.419414786 +0000 UTC m=+1856.944078362" watchObservedRunningTime="2026-04-22 19:17:49.421871268 +0000 UTC m=+1856.946534890" Apr 22 19:17:50.365399 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:50.365300 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" podUID="b16177f8-1c92-4b97-8749-13a66888dea4" containerName="splitter-graph-c8908" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:17:50.408116 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:50.408084 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:50.409283 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:50.409241 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 22 19:17:51.056787 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.056766 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" Apr 22 19:17:51.157691 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.157603 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-c8908-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-error-404-isvc-c8908-kube-rbac-proxy-sar-config\") pod \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " Apr 22 19:17:51.157848 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.157695 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhzzp\" (UniqueName: \"kubernetes.io/projected/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-kube-api-access-mhzzp\") pod \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " Apr 22 19:17:51.157848 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.157753 2577 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-proxy-tls\") pod \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\" (UID: \"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141\") " Apr 22 19:17:51.157966 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.157919 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-error-404-isvc-c8908-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-c8908-kube-rbac-proxy-sar-config") pod "42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" (UID: "42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141"). InnerVolumeSpecName "error-404-isvc-c8908-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:17:51.159699 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.159672 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" (UID: "42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:17:51.159699 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.159684 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-kube-api-access-mhzzp" (OuterVolumeSpecName: "kube-api-access-mhzzp") pod "42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" (UID: "42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141"). InnerVolumeSpecName "kube-api-access-mhzzp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:17:51.259340 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.259264 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:17:51.259340 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.259334 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-c8908-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-error-404-isvc-c8908-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:17:51.259340 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.259344 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mhzzp\" (UniqueName: \"kubernetes.io/projected/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141-kube-api-access-mhzzp\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:17:51.412954 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.412860 2577 generic.go:358] "Generic (PLEG): container finished" podID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerID="e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197" exitCode=0 Apr 22 19:17:51.412954 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.412930 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" Apr 22 19:17:51.413460 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.412952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" event={"ID":"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141","Type":"ContainerDied","Data":"e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197"} Apr 22 19:17:51.413460 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.412999 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj" event={"ID":"42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141","Type":"ContainerDied","Data":"c62c5a7c05f2d08bfac010ec9ec54365dca634f67abd08b8887cefbc089a7f91"} Apr 22 19:17:51.413460 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.413019 2577 scope.go:117] "RemoveContainer" containerID="3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466" Apr 22 19:17:51.413634 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.413459 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 22 19:17:51.422108 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.422090 2577 scope.go:117] "RemoveContainer" containerID="e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197" Apr 22 19:17:51.429410 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.429389 2577 scope.go:117] "RemoveContainer" containerID="3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466" Apr 22 19:17:51.429657 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:17:51.429639 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466\": container with ID starting with 3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466 not found: ID does not exist" containerID="3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466" Apr 22 19:17:51.429704 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.429666 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466"} err="failed to get container status \"3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466\": rpc error: code = NotFound desc = could not find container \"3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466\": container with ID starting with 3113bf51fbaa74431df9352988ed0868e19e30fbc76b6335cbbb746769d9a466 not found: ID does not exist" Apr 22 19:17:51.429704 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.429682 2577 scope.go:117] "RemoveContainer" containerID="e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197" Apr 22 19:17:51.429926 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:17:51.429907 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197\": container with ID starting with e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197 not found: ID does not exist" containerID="e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197" Apr 22 19:17:51.429963 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.429932 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197"} err="failed to get container status \"e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197\": rpc error: code = NotFound desc = could not find container 
\"e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197\": container with ID starting with e7a701118d9d42d77b5e8334f722f508764d86c8a7a9bb6bcbfe1f5870ca0197 not found: ID does not exist" Apr 22 19:17:51.434499 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.434478 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"] Apr 22 19:17:51.437821 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:51.437802 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c8908-predictor-7c65ccdfd9-5trtj"] Apr 22 19:17:52.242287 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:52.242251 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:17:52.993879 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:52.993844 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" path="/var/lib/kubelet/pods/42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141/volumes" Apr 22 19:17:55.364855 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:55.364814 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" podUID="b16177f8-1c92-4b97-8749-13a66888dea4" containerName="splitter-graph-c8908" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:17:56.417459 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:56.417432 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:17:56.417866 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:17:56.417844 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 22 19:18:00.365141 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:00.365102 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" podUID="b16177f8-1c92-4b97-8749-13a66888dea4" containerName="splitter-graph-c8908" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:18:00.365598 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:00.365223 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:18:05.364699 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:05.364660 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" podUID="b16177f8-1c92-4b97-8749-13a66888dea4" containerName="splitter-graph-c8908" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:18:06.417903 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:06.417864 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 22 19:18:10.364855 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:10.364769 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" podUID="b16177f8-1c92-4b97-8749-13a66888dea4" containerName="splitter-graph-c8908" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:18:13.512789 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.512755 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s"] Apr 22 19:18:13.513178 ip-10-0-143-56 kubenswrapper[2577]: 
I0422 19:18:13.513124 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kserve-container" Apr 22 19:18:13.513178 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.513134 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kserve-container" Apr 22 19:18:13.513178 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.513146 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kube-rbac-proxy" Apr 22 19:18:13.513178 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.513152 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kube-rbac-proxy" Apr 22 19:18:13.513342 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.513224 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kserve-container" Apr 22 19:18:13.513342 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.513239 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e7e6aa-3ad6-42ee-a761-ddf4d8bf0141" containerName="kube-rbac-proxy" Apr 22 19:18:13.518050 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.518031 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:18:13.520677 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.520645 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-e7803-serving-cert\"" Apr 22 19:18:13.520677 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.520660 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-e7803-kube-rbac-proxy-sar-config\"" Apr 22 19:18:13.525777 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.525756 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s"] Apr 22 19:18:13.553573 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.553550 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf542cf9-fe06-43fb-bb3f-00c8830493b4-proxy-tls\") pod \"switch-graph-e7803-7c4596f696-44m5s\" (UID: \"cf542cf9-fe06-43fb-bb3f-00c8830493b4\") " pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:18:13.553704 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.553636 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf542cf9-fe06-43fb-bb3f-00c8830493b4-openshift-service-ca-bundle\") pod \"switch-graph-e7803-7c4596f696-44m5s\" (UID: \"cf542cf9-fe06-43fb-bb3f-00c8830493b4\") " pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:18:13.654899 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.654864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf542cf9-fe06-43fb-bb3f-00c8830493b4-openshift-service-ca-bundle\") pod \"switch-graph-e7803-7c4596f696-44m5s\" 
(UID: \"cf542cf9-fe06-43fb-bb3f-00c8830493b4\") " pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:18:13.655071 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.654918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf542cf9-fe06-43fb-bb3f-00c8830493b4-proxy-tls\") pod \"switch-graph-e7803-7c4596f696-44m5s\" (UID: \"cf542cf9-fe06-43fb-bb3f-00c8830493b4\") " pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:18:13.655071 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:18:13.655018 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-e7803-serving-cert: secret "switch-graph-e7803-serving-cert" not found Apr 22 19:18:13.655154 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:18:13.655077 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf542cf9-fe06-43fb-bb3f-00c8830493b4-proxy-tls podName:cf542cf9-fe06-43fb-bb3f-00c8830493b4 nodeName:}" failed. No retries permitted until 2026-04-22 19:18:14.155058494 +0000 UTC m=+1881.679722023 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cf542cf9-fe06-43fb-bb3f-00c8830493b4-proxy-tls") pod "switch-graph-e7803-7c4596f696-44m5s" (UID: "cf542cf9-fe06-43fb-bb3f-00c8830493b4") : secret "switch-graph-e7803-serving-cert" not found Apr 22 19:18:13.655537 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:13.655518 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf542cf9-fe06-43fb-bb3f-00c8830493b4-openshift-service-ca-bundle\") pod \"switch-graph-e7803-7c4596f696-44m5s\" (UID: \"cf542cf9-fe06-43fb-bb3f-00c8830493b4\") " pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:18:14.159262 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:14.159223 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf542cf9-fe06-43fb-bb3f-00c8830493b4-proxy-tls\") pod \"switch-graph-e7803-7c4596f696-44m5s\" (UID: \"cf542cf9-fe06-43fb-bb3f-00c8830493b4\") " pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:18:14.161642 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:14.161618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf542cf9-fe06-43fb-bb3f-00c8830493b4-proxy-tls\") pod \"switch-graph-e7803-7c4596f696-44m5s\" (UID: \"cf542cf9-fe06-43fb-bb3f-00c8830493b4\") " pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:18:14.429380 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:14.429298 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:18:14.556116 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:14.556089 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s"] Apr 22 19:18:15.365345 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:15.365299 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" podUID="b16177f8-1c92-4b97-8749-13a66888dea4" containerName="splitter-graph-c8908" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:18:15.495551 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:15.495515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" event={"ID":"cf542cf9-fe06-43fb-bb3f-00c8830493b4","Type":"ContainerStarted","Data":"69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc"} Apr 22 19:18:15.495718 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:15.495559 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" event={"ID":"cf542cf9-fe06-43fb-bb3f-00c8830493b4","Type":"ContainerStarted","Data":"06dbf142e7d3c66d65c87ede1ccc92bf0fcfd03c6904dc1624e54db37a1931f1"} Apr 22 19:18:15.495718 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:15.495586 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:18:15.511918 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:15.511872 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" podStartSLOduration=2.51185755 podStartE2EDuration="2.51185755s" podCreationTimestamp="2026-04-22 19:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 19:18:15.510163281 +0000 UTC m=+1883.034826834" watchObservedRunningTime="2026-04-22 19:18:15.51185755 +0000 UTC m=+1883.036521101" Apr 22 19:18:16.417847 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:16.417808 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 22 19:18:17.732428 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:18:17.732395 2577 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16177f8_1c92_4b97_8749_13a66888dea4.slice/crio-conmon-a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607.scope/cpuset.cpus.effective": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16177f8_1c92_4b97_8749_13a66888dea4.slice/crio-conmon-a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607.scope/cpuset.cpus.effective: no such device Apr 22 19:18:17.738488 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:18:17.738455 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16177f8_1c92_4b97_8749_13a66888dea4.slice/crio-a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16177f8_1c92_4b97_8749_13a66888dea4.slice/crio-conmon-a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:18:17.738614 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:18:17.738503 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16177f8_1c92_4b97_8749_13a66888dea4.slice/crio-conmon-a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16177f8_1c92_4b97_8749_13a66888dea4.slice/crio-a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:18:17.739469 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:18:17.739441 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16177f8_1c92_4b97_8749_13a66888dea4.slice/crio-a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16177f8_1c92_4b97_8749_13a66888dea4.slice/crio-conmon-a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:18:17.743053 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:18:17.742988 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16177f8_1c92_4b97_8749_13a66888dea4.slice/crio-conmon-a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:18:17.886130 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:17.886101 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:18:17.992249 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:17.992160 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b16177f8-1c92-4b97-8749-13a66888dea4-openshift-service-ca-bundle\") pod \"b16177f8-1c92-4b97-8749-13a66888dea4\" (UID: \"b16177f8-1c92-4b97-8749-13a66888dea4\") " Apr 22 19:18:17.992431 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:17.992305 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b16177f8-1c92-4b97-8749-13a66888dea4-proxy-tls\") pod \"b16177f8-1c92-4b97-8749-13a66888dea4\" (UID: \"b16177f8-1c92-4b97-8749-13a66888dea4\") " Apr 22 19:18:17.992557 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:17.992535 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b16177f8-1c92-4b97-8749-13a66888dea4-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b16177f8-1c92-4b97-8749-13a66888dea4" (UID: "b16177f8-1c92-4b97-8749-13a66888dea4"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:18:17.994376 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:17.994346 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16177f8-1c92-4b97-8749-13a66888dea4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b16177f8-1c92-4b97-8749-13a66888dea4" (UID: "b16177f8-1c92-4b97-8749-13a66888dea4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:18:18.092909 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.092877 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b16177f8-1c92-4b97-8749-13a66888dea4-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:18:18.092909 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.092905 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b16177f8-1c92-4b97-8749-13a66888dea4-openshift-service-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:18:18.508357 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.508318 2577 generic.go:358] "Generic (PLEG): container finished" podID="b16177f8-1c92-4b97-8749-13a66888dea4" containerID="a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607" exitCode=0 Apr 22 19:18:18.508535 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.508375 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" Apr 22 19:18:18.508535 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.508406 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" event={"ID":"b16177f8-1c92-4b97-8749-13a66888dea4","Type":"ContainerDied","Data":"a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607"} Apr 22 19:18:18.508535 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.508447 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p" event={"ID":"b16177f8-1c92-4b97-8749-13a66888dea4","Type":"ContainerDied","Data":"5cd45f54d5b60e36c6d970a020392c1748a0cb0e8407d7175b1e15edd823e388"} Apr 22 19:18:18.508535 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.508463 2577 scope.go:117] "RemoveContainer" containerID="a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607" Apr 22 19:18:18.516934 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.516916 2577 scope.go:117] "RemoveContainer" containerID="a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607" Apr 22 19:18:18.517166 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:18:18.517147 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607\": container with ID starting with a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607 not found: ID does not exist" containerID="a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607" Apr 22 19:18:18.517218 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.517179 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607"} err="failed to get container status 
\"a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607\": rpc error: code = NotFound desc = could not find container \"a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607\": container with ID starting with a227a4895b2c7c97a0166c7fddef1996ff2f1a6c91eb6d020fb9bfba6f99a607 not found: ID does not exist" Apr 22 19:18:18.543287 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.531201 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p"] Apr 22 19:18:18.549568 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.549532 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-c8908-6bc4bd4ff9-82b7p"] Apr 22 19:18:18.991085 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:18.991054 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b16177f8-1c92-4b97-8749-13a66888dea4" path="/var/lib/kubelet/pods/b16177f8-1c92-4b97-8749-13a66888dea4/volumes" Apr 22 19:18:21.504903 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:21.504875 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:18:26.418597 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:26.418557 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 22 19:18:36.419085 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:36.419051 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:18:47.926668 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:47.926633 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh"] Apr 22 19:18:47.927039 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:47.927001 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b16177f8-1c92-4b97-8749-13a66888dea4" containerName="splitter-graph-c8908" Apr 22 19:18:47.927039 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:47.927012 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16177f8-1c92-4b97-8749-13a66888dea4" containerName="splitter-graph-c8908" Apr 22 19:18:47.927115 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:47.927090 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b16177f8-1c92-4b97-8749-13a66888dea4" containerName="splitter-graph-c8908" Apr 22 19:18:47.930221 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:47.930201 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:18:47.932885 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:47.932859 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-352f4-kube-rbac-proxy-sar-config\"" Apr 22 19:18:47.932885 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:47.932860 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-352f4-serving-cert\"" Apr 22 19:18:47.937031 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:47.937002 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh"] Apr 22 19:18:48.068457 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.068424 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16389b15-0804-49b8-9ef6-dc978cf25f78-openshift-service-ca-bundle\") pod \"splitter-graph-352f4-56f7c7d574-mbshh\" (UID: 
\"16389b15-0804-49b8-9ef6-dc978cf25f78\") " pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:18:48.068639 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.068472 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16389b15-0804-49b8-9ef6-dc978cf25f78-proxy-tls\") pod \"splitter-graph-352f4-56f7c7d574-mbshh\" (UID: \"16389b15-0804-49b8-9ef6-dc978cf25f78\") " pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:18:48.169889 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.169857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16389b15-0804-49b8-9ef6-dc978cf25f78-openshift-service-ca-bundle\") pod \"splitter-graph-352f4-56f7c7d574-mbshh\" (UID: \"16389b15-0804-49b8-9ef6-dc978cf25f78\") " pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:18:48.170085 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.169902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16389b15-0804-49b8-9ef6-dc978cf25f78-proxy-tls\") pod \"splitter-graph-352f4-56f7c7d574-mbshh\" (UID: \"16389b15-0804-49b8-9ef6-dc978cf25f78\") " pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:18:48.170468 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.170445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16389b15-0804-49b8-9ef6-dc978cf25f78-openshift-service-ca-bundle\") pod \"splitter-graph-352f4-56f7c7d574-mbshh\" (UID: \"16389b15-0804-49b8-9ef6-dc978cf25f78\") " pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:18:48.172263 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.172236 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16389b15-0804-49b8-9ef6-dc978cf25f78-proxy-tls\") pod \"splitter-graph-352f4-56f7c7d574-mbshh\" (UID: \"16389b15-0804-49b8-9ef6-dc978cf25f78\") " pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:18:48.242891 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.242865 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:18:48.365538 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.365510 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh"] Apr 22 19:18:48.367701 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:18:48.367670 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16389b15_0804_49b8_9ef6_dc978cf25f78.slice/crio-637f048e26b9048c9c22992ab10100d44d4e68c94b7e4448710107be77148b8c WatchSource:0}: Error finding container 637f048e26b9048c9c22992ab10100d44d4e68c94b7e4448710107be77148b8c: Status 404 returned error can't find the container with id 637f048e26b9048c9c22992ab10100d44d4e68c94b7e4448710107be77148b8c Apr 22 19:18:48.619854 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.619769 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" event={"ID":"16389b15-0804-49b8-9ef6-dc978cf25f78","Type":"ContainerStarted","Data":"7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d"} Apr 22 19:18:48.619854 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.619804 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" event={"ID":"16389b15-0804-49b8-9ef6-dc978cf25f78","Type":"ContainerStarted","Data":"637f048e26b9048c9c22992ab10100d44d4e68c94b7e4448710107be77148b8c"} Apr 22 
19:18:48.620051 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.619898 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:18:48.636942 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:48.636893 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" podStartSLOduration=1.636879288 podStartE2EDuration="1.636879288s" podCreationTimestamp="2026-04-22 19:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:18:48.635520389 +0000 UTC m=+1916.160183943" watchObservedRunningTime="2026-04-22 19:18:48.636879288 +0000 UTC m=+1916.161542838" Apr 22 19:18:54.629214 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:18:54.629184 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:21:53.031675 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:21:53.031647 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 19:21:53.035136 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:21:53.035114 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 19:26:53.055803 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:26:53.055687 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 19:26:53.061044 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:26:53.061024 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 19:27:02.649015 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:02.648979 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh"] Apr 22 19:27:02.649593 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:02.649228 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" podUID="16389b15-0804-49b8-9ef6-dc978cf25f78" containerName="splitter-graph-352f4" containerID="cri-o://7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d" gracePeriod=30 Apr 22 19:27:02.771255 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:02.771218 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv"] Apr 22 19:27:02.771646 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:02.771585 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kserve-container" containerID="cri-o://cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d" gracePeriod=30 Apr 22 19:27:02.771646 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:02.771626 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kube-rbac-proxy" containerID="cri-o://317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0" gracePeriod=30 Apr 22 19:27:03.303686 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:03.303652 2577 generic.go:358] "Generic (PLEG): container finished" podID="8317a774-5ed9-452a-8edc-6a717007cb36" 
containerID="317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0" exitCode=2 Apr 22 19:27:03.303878 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:03.303708 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" event={"ID":"8317a774-5ed9-452a-8edc-6a717007cb36","Type":"ContainerDied","Data":"317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0"} Apr 22 19:27:04.627505 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:04.627466 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" podUID="16389b15-0804-49b8-9ef6-dc978cf25f78" containerName="splitter-graph-352f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:27:05.916247 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:05.916223 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:27:06.004457 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.004428 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-352f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8317a774-5ed9-452a-8edc-6a717007cb36-error-404-isvc-352f4-kube-rbac-proxy-sar-config\") pod \"8317a774-5ed9-452a-8edc-6a717007cb36\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " Apr 22 19:27:06.004637 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.004474 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8317a774-5ed9-452a-8edc-6a717007cb36-proxy-tls\") pod \"8317a774-5ed9-452a-8edc-6a717007cb36\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " Apr 22 19:27:06.004637 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.004505 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jrc7n\" (UniqueName: \"kubernetes.io/projected/8317a774-5ed9-452a-8edc-6a717007cb36-kube-api-access-jrc7n\") pod \"8317a774-5ed9-452a-8edc-6a717007cb36\" (UID: \"8317a774-5ed9-452a-8edc-6a717007cb36\") " Apr 22 19:27:06.004845 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.004823 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8317a774-5ed9-452a-8edc-6a717007cb36-error-404-isvc-352f4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-352f4-kube-rbac-proxy-sar-config") pod "8317a774-5ed9-452a-8edc-6a717007cb36" (UID: "8317a774-5ed9-452a-8edc-6a717007cb36"). InnerVolumeSpecName "error-404-isvc-352f4-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:06.006683 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.006658 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8317a774-5ed9-452a-8edc-6a717007cb36-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8317a774-5ed9-452a-8edc-6a717007cb36" (UID: "8317a774-5ed9-452a-8edc-6a717007cb36"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:06.006784 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.006658 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8317a774-5ed9-452a-8edc-6a717007cb36-kube-api-access-jrc7n" (OuterVolumeSpecName: "kube-api-access-jrc7n") pod "8317a774-5ed9-452a-8edc-6a717007cb36" (UID: "8317a774-5ed9-452a-8edc-6a717007cb36"). InnerVolumeSpecName "kube-api-access-jrc7n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:27:06.105628 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.105590 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-352f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8317a774-5ed9-452a-8edc-6a717007cb36-error-404-isvc-352f4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:27:06.105628 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.105622 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8317a774-5ed9-452a-8edc-6a717007cb36-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:27:06.105628 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.105633 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jrc7n\" (UniqueName: \"kubernetes.io/projected/8317a774-5ed9-452a-8edc-6a717007cb36-kube-api-access-jrc7n\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:27:06.314731 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.314639 2577 generic.go:358] "Generic (PLEG): container finished" podID="8317a774-5ed9-452a-8edc-6a717007cb36" containerID="cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d" exitCode=0 Apr 22 19:27:06.314731 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.314722 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" Apr 22 19:27:06.314933 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.314720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" event={"ID":"8317a774-5ed9-452a-8edc-6a717007cb36","Type":"ContainerDied","Data":"cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d"} Apr 22 19:27:06.314933 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.314775 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv" event={"ID":"8317a774-5ed9-452a-8edc-6a717007cb36","Type":"ContainerDied","Data":"0597da79fadff8d28eb82124d9596d654594acd81137f949f43cd27efd41ff5d"} Apr 22 19:27:06.314933 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.314791 2577 scope.go:117] "RemoveContainer" containerID="317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0" Apr 22 19:27:06.323132 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.323117 2577 scope.go:117] "RemoveContainer" containerID="cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d" Apr 22 19:27:06.330454 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.330435 2577 scope.go:117] "RemoveContainer" containerID="317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0" Apr 22 19:27:06.330709 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:27:06.330690 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0\": container with ID starting with 317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0 not found: ID does not exist" containerID="317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0" Apr 22 19:27:06.330760 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.330717 2577 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0"} err="failed to get container status \"317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0\": rpc error: code = NotFound desc = could not find container \"317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0\": container with ID starting with 317c6fa77e8ec09a0d496bb3361cae7e3a478d2c5230090d3a0d82519b85b9a0 not found: ID does not exist" Apr 22 19:27:06.330760 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.330736 2577 scope.go:117] "RemoveContainer" containerID="cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d" Apr 22 19:27:06.330940 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:27:06.330921 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d\": container with ID starting with cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d not found: ID does not exist" containerID="cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d" Apr 22 19:27:06.330982 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.330945 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d"} err="failed to get container status \"cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d\": rpc error: code = NotFound desc = could not find container \"cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d\": container with ID starting with cdeb9f9f300a8a28fbaebd6c1ad9d1f3f93106a7fd0eb72bcba178c0737e454d not found: ID does not exist" Apr 22 19:27:06.335144 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.335122 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv"] Apr 
22 19:27:06.339186 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.339164 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-352f4-predictor-6f5f47949-9nfnv"] Apr 22 19:27:06.990827 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:06.990792 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" path="/var/lib/kubelet/pods/8317a774-5ed9-452a-8edc-6a717007cb36/volumes" Apr 22 19:27:09.627492 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:09.627392 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" podUID="16389b15-0804-49b8-9ef6-dc978cf25f78" containerName="splitter-graph-352f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:27:14.627302 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:14.627247 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" podUID="16389b15-0804-49b8-9ef6-dc978cf25f78" containerName="splitter-graph-352f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:27:14.627684 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:14.627365 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:27:19.627361 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:19.627316 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" podUID="16389b15-0804-49b8-9ef6-dc978cf25f78" containerName="splitter-graph-352f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:27:24.627123 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:24.627084 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" 
podUID="16389b15-0804-49b8-9ef6-dc978cf25f78" containerName="splitter-graph-352f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:27:29.627951 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:29.627910 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" podUID="16389b15-0804-49b8-9ef6-dc978cf25f78" containerName="splitter-graph-352f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:27:32.792825 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:32.792802 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:27:32.831525 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:32.831500 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16389b15-0804-49b8-9ef6-dc978cf25f78-proxy-tls\") pod \"16389b15-0804-49b8-9ef6-dc978cf25f78\" (UID: \"16389b15-0804-49b8-9ef6-dc978cf25f78\") " Apr 22 19:27:32.831704 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:32.831532 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16389b15-0804-49b8-9ef6-dc978cf25f78-openshift-service-ca-bundle\") pod \"16389b15-0804-49b8-9ef6-dc978cf25f78\" (UID: \"16389b15-0804-49b8-9ef6-dc978cf25f78\") " Apr 22 19:27:32.831923 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:32.831901 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16389b15-0804-49b8-9ef6-dc978cf25f78-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "16389b15-0804-49b8-9ef6-dc978cf25f78" (UID: "16389b15-0804-49b8-9ef6-dc978cf25f78"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:32.833613 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:32.833588 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16389b15-0804-49b8-9ef6-dc978cf25f78-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "16389b15-0804-49b8-9ef6-dc978cf25f78" (UID: "16389b15-0804-49b8-9ef6-dc978cf25f78"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:32.933130 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:32.933049 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16389b15-0804-49b8-9ef6-dc978cf25f78-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:27:32.933130 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:32.933078 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16389b15-0804-49b8-9ef6-dc978cf25f78-openshift-service-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:27:33.405080 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:33.405047 2577 generic.go:358] "Generic (PLEG): container finished" podID="16389b15-0804-49b8-9ef6-dc978cf25f78" containerID="7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d" exitCode=0 Apr 22 19:27:33.405294 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:33.405088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" event={"ID":"16389b15-0804-49b8-9ef6-dc978cf25f78","Type":"ContainerDied","Data":"7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d"} Apr 22 19:27:33.405294 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:33.405108 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" 
event={"ID":"16389b15-0804-49b8-9ef6-dc978cf25f78","Type":"ContainerDied","Data":"637f048e26b9048c9c22992ab10100d44d4e68c94b7e4448710107be77148b8c"} Apr 22 19:27:33.405294 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:33.405117 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh" Apr 22 19:27:33.405294 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:33.405131 2577 scope.go:117] "RemoveContainer" containerID="7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d" Apr 22 19:27:33.413185 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:33.413150 2577 scope.go:117] "RemoveContainer" containerID="7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d" Apr 22 19:27:33.413464 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:27:33.413442 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d\": container with ID starting with 7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d not found: ID does not exist" containerID="7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d" Apr 22 19:27:33.413528 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:33.413472 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d"} err="failed to get container status \"7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d\": rpc error: code = NotFound desc = could not find container \"7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d\": container with ID starting with 7ca39ce4d0ddcb43db65111659ac18b8fbecd4c1a432919706943b5a74785b8d not found: ID does not exist" Apr 22 19:27:33.420996 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:33.420974 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh"] Apr 22 19:27:33.423780 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:33.423758 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-352f4-56f7c7d574-mbshh"] Apr 22 19:27:34.990691 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:27:34.990657 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16389b15-0804-49b8-9ef6-dc978cf25f78" path="/var/lib/kubelet/pods/16389b15-0804-49b8-9ef6-dc978cf25f78/volumes" Apr 22 19:31:53.082801 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:31:53.082683 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 19:31:53.088478 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:31:53.088458 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 19:34:32.962673 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:32.962641 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s"] Apr 22 19:34:32.963214 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:32.962914 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" podUID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" containerName="switch-graph-e7803" containerID="cri-o://69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc" gracePeriod=30 Apr 22 19:34:33.147114 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:33.147079 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl"] Apr 22 19:34:33.147412 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:33.147373 2577 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kserve-container" containerID="cri-o://08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681" gracePeriod=30 Apr 22 19:34:33.147515 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:33.147419 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kube-rbac-proxy" containerID="cri-o://d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0" gracePeriod=30 Apr 22 19:34:33.814479 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:33.814442 2577 generic.go:358] "Generic (PLEG): container finished" podID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerID="d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0" exitCode=2 Apr 22 19:34:33.814654 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:33.814497 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" event={"ID":"55f046b3-38f6-4451-b8a1-cf7b0bdfc432","Type":"ContainerDied","Data":"d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0"} Apr 22 19:34:36.290223 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.290198 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:34:36.388032 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.387954 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-proxy-tls\") pod \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " Apr 22 19:34:36.388032 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.388013 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-e7803-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-error-404-isvc-e7803-kube-rbac-proxy-sar-config\") pod \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " Apr 22 19:34:36.388218 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.388084 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9nxg\" (UniqueName: \"kubernetes.io/projected/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-kube-api-access-q9nxg\") pod \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\" (UID: \"55f046b3-38f6-4451-b8a1-cf7b0bdfc432\") " Apr 22 19:34:36.388546 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.388511 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-error-404-isvc-e7803-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-e7803-kube-rbac-proxy-sar-config") pod "55f046b3-38f6-4451-b8a1-cf7b0bdfc432" (UID: "55f046b3-38f6-4451-b8a1-cf7b0bdfc432"). InnerVolumeSpecName "error-404-isvc-e7803-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:34:36.390245 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.390214 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "55f046b3-38f6-4451-b8a1-cf7b0bdfc432" (UID: "55f046b3-38f6-4451-b8a1-cf7b0bdfc432"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:34:36.390245 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.390218 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-kube-api-access-q9nxg" (OuterVolumeSpecName: "kube-api-access-q9nxg") pod "55f046b3-38f6-4451-b8a1-cf7b0bdfc432" (UID: "55f046b3-38f6-4451-b8a1-cf7b0bdfc432"). InnerVolumeSpecName "kube-api-access-q9nxg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:34:36.488921 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.488872 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q9nxg\" (UniqueName: \"kubernetes.io/projected/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-kube-api-access-q9nxg\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:34:36.488921 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.488916 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:34:36.488921 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.488930 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-e7803-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55f046b3-38f6-4451-b8a1-cf7b0bdfc432-error-404-isvc-e7803-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:34:36.503670 
ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.503632 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" podUID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" containerName="switch-graph-e7803" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:36.826119 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.826087 2577 generic.go:358] "Generic (PLEG): container finished" podID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerID="08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681" exitCode=0 Apr 22 19:34:36.826364 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.826158 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" event={"ID":"55f046b3-38f6-4451-b8a1-cf7b0bdfc432","Type":"ContainerDied","Data":"08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681"} Apr 22 19:34:36.826364 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.826195 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" event={"ID":"55f046b3-38f6-4451-b8a1-cf7b0bdfc432","Type":"ContainerDied","Data":"9b22cf778a511912fadbc4247be69db31a1b6290a467f495f08d43ab33bb41c4"} Apr 22 19:34:36.826364 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.826214 2577 scope.go:117] "RemoveContainer" containerID="d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0" Apr 22 19:34:36.826364 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.826219 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl" Apr 22 19:34:36.835344 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.835320 2577 scope.go:117] "RemoveContainer" containerID="08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681" Apr 22 19:34:36.842393 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.842374 2577 scope.go:117] "RemoveContainer" containerID="d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0" Apr 22 19:34:36.842632 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:34:36.842612 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0\": container with ID starting with d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0 not found: ID does not exist" containerID="d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0" Apr 22 19:34:36.842698 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.842640 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0"} err="failed to get container status \"d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0\": rpc error: code = NotFound desc = could not find container \"d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0\": container with ID starting with d6c58a19ec51930e60b1ddf05719929a065be6062d428c38f71a4700d605e4d0 not found: ID does not exist" Apr 22 19:34:36.842698 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.842658 2577 scope.go:117] "RemoveContainer" containerID="08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681" Apr 22 19:34:36.842910 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:34:36.842893 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681\": container with ID starting with 08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681 not found: ID does not exist" containerID="08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681" Apr 22 19:34:36.842954 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.842917 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681"} err="failed to get container status \"08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681\": rpc error: code = NotFound desc = could not find container \"08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681\": container with ID starting with 08ba35a8e183ae3549de2353197788ada33b088cb6fb96b7606b5547bbb45681 not found: ID does not exist" Apr 22 19:34:36.847894 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.847872 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl"] Apr 22 19:34:36.850936 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.850916 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e7803-predictor-6f9984fcb7-mv7xl"] Apr 22 19:34:36.991716 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:36.991681 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" path="/var/lib/kubelet/pods/55f046b3-38f6-4451-b8a1-cf7b0bdfc432/volumes" Apr 22 19:34:41.504029 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:41.503939 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" podUID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" containerName="switch-graph-e7803" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:46.504205 ip-10-0-143-56 kubenswrapper[2577]: I0422 
19:34:46.504157 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" podUID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" containerName="switch-graph-e7803" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:46.504609 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:46.504310 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:34:48.872448 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:48.872420 2577 ???:1] "http2: server: error reading preface from client 10.0.143.56:43458: read tcp 10.0.143.56:10250->10.0.143.56:43458: read: connection reset by peer" Apr 22 19:34:48.875793 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:48.875769 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:34:49.664240 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:49.664213 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:34:50.455091 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:50.455044 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:34:51.214827 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:51.214795 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:34:51.504410 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:51.504315 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" 
podUID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" containerName="switch-graph-e7803" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:51.978970 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:51.978942 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:34:52.738476 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:52.738439 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:34:53.512207 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:53.512177 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:34:54.285435 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:54.285395 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:34:55.070864 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:55.070828 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:34:55.882146 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:55.882111 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:34:56.503707 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:56.503667 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" podUID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" 
containerName="switch-graph-e7803" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:56.726580 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:56.726545 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:34:57.513504 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:34:57.513473 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e7803-7c4596f696-44m5s_cf542cf9-fe06-43fb-bb3f-00c8830493b4/switch-graph-e7803/0.log" Apr 22 19:35:01.503841 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:01.503802 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" podUID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" containerName="switch-graph-e7803" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:35:02.611631 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:02.611604 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mlrzp_31f9f33a-79c4-425d-991c-7eb42f160268/global-pull-secret-syncer/0.log" Apr 22 19:35:02.732879 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:02.732852 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zttd6_d44f31c4-687d-42a9-b6c4-230bd1f85d32/konnectivity-agent/0.log" Apr 22 19:35:02.802696 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:02.802665 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-56.ec2.internal_00b417b51c5d26c97b6e66b7df9d6ed9/haproxy/0.log" Apr 22 19:35:03.108190 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.108169 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:35:03.223397 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.223307 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf542cf9-fe06-43fb-bb3f-00c8830493b4-proxy-tls\") pod \"cf542cf9-fe06-43fb-bb3f-00c8830493b4\" (UID: \"cf542cf9-fe06-43fb-bb3f-00c8830493b4\") " Apr 22 19:35:03.223544 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.223411 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf542cf9-fe06-43fb-bb3f-00c8830493b4-openshift-service-ca-bundle\") pod \"cf542cf9-fe06-43fb-bb3f-00c8830493b4\" (UID: \"cf542cf9-fe06-43fb-bb3f-00c8830493b4\") " Apr 22 19:35:03.223772 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.223748 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf542cf9-fe06-43fb-bb3f-00c8830493b4-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "cf542cf9-fe06-43fb-bb3f-00c8830493b4" (UID: "cf542cf9-fe06-43fb-bb3f-00c8830493b4"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:35:03.225235 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.225214 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf542cf9-fe06-43fb-bb3f-00c8830493b4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cf542cf9-fe06-43fb-bb3f-00c8830493b4" (UID: "cf542cf9-fe06-43fb-bb3f-00c8830493b4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:35:03.324663 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.324608 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf542cf9-fe06-43fb-bb3f-00c8830493b4-openshift-service-ca-bundle\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:35:03.324663 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.324655 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf542cf9-fe06-43fb-bb3f-00c8830493b4-proxy-tls\") on node \"ip-10-0-143-56.ec2.internal\" DevicePath \"\"" Apr 22 19:35:03.918033 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.918000 2577 generic.go:358] "Generic (PLEG): container finished" podID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" containerID="69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc" exitCode=0 Apr 22 19:35:03.918585 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.918064 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" Apr 22 19:35:03.918585 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.918090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" event={"ID":"cf542cf9-fe06-43fb-bb3f-00c8830493b4","Type":"ContainerDied","Data":"69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc"} Apr 22 19:35:03.918585 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.918134 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s" event={"ID":"cf542cf9-fe06-43fb-bb3f-00c8830493b4","Type":"ContainerDied","Data":"06dbf142e7d3c66d65c87ede1ccc92bf0fcfd03c6904dc1624e54db37a1931f1"} Apr 22 19:35:03.918585 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.918158 2577 scope.go:117] "RemoveContainer" containerID="69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc" Apr 22 19:35:03.926679 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.926662 2577 scope.go:117] "RemoveContainer" containerID="69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc" Apr 22 19:35:03.926915 ip-10-0-143-56 kubenswrapper[2577]: E0422 19:35:03.926895 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc\": container with ID starting with 69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc not found: ID does not exist" containerID="69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc" Apr 22 19:35:03.926966 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.926924 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc"} err="failed to get container status 
\"69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc\": rpc error: code = NotFound desc = could not find container \"69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc\": container with ID starting with 69189dfa824a05882d9e6f4f84ce63b897965b9ec9a83c0747535c26cae494dc not found: ID does not exist" Apr 22 19:35:03.939983 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.939958 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s"] Apr 22 19:35:03.943216 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:03.943186 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e7803-7c4596f696-44m5s"] Apr 22 19:35:04.991312 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:04.991258 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" path="/var/lib/kubelet/pods/cf542cf9-fe06-43fb-bb3f-00c8830493b4/volumes" Apr 22 19:35:05.701118 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:05.701090 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_15d9b432-370b-4e02-af61-f3f0163e829c/alertmanager/0.log" Apr 22 19:35:05.730962 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:05.730938 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_15d9b432-370b-4e02-af61-f3f0163e829c/config-reloader/0.log" Apr 22 19:35:05.757866 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:05.757841 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_15d9b432-370b-4e02-af61-f3f0163e829c/kube-rbac-proxy-web/0.log" Apr 22 19:35:05.786188 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:05.786166 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_15d9b432-370b-4e02-af61-f3f0163e829c/kube-rbac-proxy/0.log" Apr 22 19:35:05.808453 
ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:05.808431 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_15d9b432-370b-4e02-af61-f3f0163e829c/kube-rbac-proxy-metric/0.log" Apr 22 19:35:05.842118 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:05.842096 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_15d9b432-370b-4e02-af61-f3f0163e829c/prom-label-proxy/0.log" Apr 22 19:35:05.877110 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:05.877085 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_15d9b432-370b-4e02-af61-f3f0163e829c/init-config-reloader/0.log" Apr 22 19:35:05.926097 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:05.926070 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-7fshx_76eb6fb3-6155-4b2e-86d2-e26d23bbb6f3/cluster-monitoring-operator/0.log" Apr 22 19:35:05.949472 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:05.949449 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-chvkl_278d1366-e4d5-4510-97e7-454f852e755e/kube-state-metrics/0.log" Apr 22 19:35:05.979020 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:05.978992 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-chvkl_278d1366-e4d5-4510-97e7-454f852e755e/kube-rbac-proxy-main/0.log" Apr 22 19:35:06.004255 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:06.004229 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-chvkl_278d1366-e4d5-4510-97e7-454f852e755e/kube-rbac-proxy-self/0.log" Apr 22 19:35:06.035376 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:06.035356 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_metrics-server-5bc5566b49-nwt7t_5443706c-f0f0-4036-8c01-0faa5b4b7f57/metrics-server/0.log" Apr 22 19:35:06.177875 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:06.177845 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8mkqk_21175012-ee46-4d06-8d03-22a7d3555566/node-exporter/0.log" Apr 22 19:35:06.200411 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:06.200388 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8mkqk_21175012-ee46-4d06-8d03-22a7d3555566/kube-rbac-proxy/0.log" Apr 22 19:35:06.222403 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:06.222379 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8mkqk_21175012-ee46-4d06-8d03-22a7d3555566/init-textfile/0.log" Apr 22 19:35:06.637280 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:06.637239 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d58b4f4bd-fgwqr_f2809948-8435-496d-ae4b-0791482f79c7/telemeter-client/0.log" Apr 22 19:35:06.658316 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:06.658291 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d58b4f4bd-fgwqr_f2809948-8435-496d-ae4b-0791482f79c7/reload/0.log" Apr 22 19:35:06.680088 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:06.680067 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d58b4f4bd-fgwqr_f2809948-8435-496d-ae4b-0791482f79c7/kube-rbac-proxy/0.log" Apr 22 19:35:08.446284 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:08.446246 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/2.log" Apr 22 19:35:08.450249 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:08.450227 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4kjm_407ba526-67b3-4fe5-9bc6-2c9894fb034f/console-operator/3.log" Apr 22 19:35:08.841433 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:08.841405 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-764c74bbf9-x9t8c_ab239c5f-2846-4e56-9332-24bbdc368b27/console/0.log" Apr 22 19:35:08.873938 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:08.873907 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-kj98m_863935b8-e8a7-4f32-aa10-ab772cc335b8/download-server/0.log" Apr 22 19:35:09.252596 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.252571 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-p8mkx_86d47618-2f32-40db-a2c3-e3a19e106b16/volume-data-source-validator/0.log" Apr 22 19:35:09.917833 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.917802 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"] Apr 22 19:35:09.918200 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918157 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kube-rbac-proxy" Apr 22 19:35:09.918200 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918172 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kube-rbac-proxy" Apr 22 19:35:09.918200 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918186 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kserve-container" Apr 22 19:35:09.918200 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918191 2577 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kserve-container" Apr 22 19:35:09.918200 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918199 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16389b15-0804-49b8-9ef6-dc978cf25f78" containerName="splitter-graph-352f4" Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918205 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="16389b15-0804-49b8-9ef6-dc978cf25f78" containerName="splitter-graph-352f4" Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918217 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kserve-container" Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918222 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kserve-container" Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918228 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" containerName="switch-graph-e7803" Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918235 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" containerName="switch-graph-e7803" Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918249 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kube-rbac-proxy" Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918254 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kube-rbac-proxy" Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918331 2577 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kube-rbac-proxy"
Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918340 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="16389b15-0804-49b8-9ef6-dc978cf25f78" containerName="splitter-graph-352f4"
Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918348 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kube-rbac-proxy"
Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918356 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf542cf9-fe06-43fb-bb3f-00c8830493b4" containerName="switch-graph-e7803"
Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918363 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8317a774-5ed9-452a-8edc-6a717007cb36" containerName="kserve-container"
Apr 22 19:35:09.918395 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.918370 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="55f046b3-38f6-4451-b8a1-cf7b0bdfc432" containerName="kserve-container"
Apr 22 19:35:09.920810 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.920789 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-68s2k_130369e7-d304-4500-9ad6-18b8f2f4e731/dns/0.log"
Apr 22 19:35:09.921519 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.921500 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:09.923966 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.923948 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6dp56\"/\"openshift-service-ca.crt\""
Apr 22 19:35:09.925520 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.925496 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6dp56\"/\"default-dockercfg-dcnhf\""
Apr 22 19:35:09.925644 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.925578 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6dp56\"/\"kube-root-ca.crt\""
Apr 22 19:35:09.927922 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.927900 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"]
Apr 22 19:35:09.949331 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:09.949302 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-68s2k_130369e7-d304-4500-9ad6-18b8f2f4e731/kube-rbac-proxy/0.log"
Apr 22 19:35:10.010035 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.010009 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k85gs_77f410ec-92d3-4e11-871d-3bd6da0e0d1f/dns-node-resolver/0.log"
Apr 22 19:35:10.083069 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.083039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-lib-modules\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.083196 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.083071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-podres\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.083196 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.083151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-sys\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.083314 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.083220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-proc\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.083314 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.083244 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x4ts\" (UniqueName: \"kubernetes.io/projected/101d67b7-3f89-4e0c-9c20-908c4f76491e-kube-api-access-9x4ts\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.183919 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.183824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-proc\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.183919 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.183874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x4ts\" (UniqueName: \"kubernetes.io/projected/101d67b7-3f89-4e0c-9c20-908c4f76491e-kube-api-access-9x4ts\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.183919 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.183915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-lib-modules\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.184205 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.183933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-podres\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.184205 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.183950 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-proc\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.184205 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.183974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-sys\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.184205 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.184037 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-sys\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.184205 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.184079 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-podres\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.184205 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.184082 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/101d67b7-3f89-4e0c-9c20-908c4f76491e-lib-modules\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.191754 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.191734 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x4ts\" (UniqueName: \"kubernetes.io/projected/101d67b7-3f89-4e0c-9c20-908c4f76491e-kube-api-access-9x4ts\") pod \"perf-node-gather-daemonset-xlzr4\" (UID: \"101d67b7-3f89-4e0c-9c20-908c4f76491e\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.232375 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.232338 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.350721 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.350655 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"]
Apr 22 19:35:10.352973 ip-10-0-143-56 kubenswrapper[2577]: W0422 19:35:10.352950 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod101d67b7_3f89_4e0c_9c20_908c4f76491e.slice/crio-20f96f65ed14db62a4a82bada1cbe21ca71f6c842f0540ed0afb8cc03c744481 WatchSource:0}: Error finding container 20f96f65ed14db62a4a82bada1cbe21ca71f6c842f0540ed0afb8cc03c744481: Status 404 returned error can't find the container with id 20f96f65ed14db62a4a82bada1cbe21ca71f6c842f0540ed0afb8cc03c744481
Apr 22 19:35:10.354517 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.354501 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:35:10.470503 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.470478 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-597d994cdc-cvzrg_00ddfe65-9d38-4b62-a7fc-877af5eec212/registry/0.log"
Apr 22 19:35:10.490355 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.490329 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5t6hp_023c27e3-86e9-4182-bf8f-c7b6197bc958/node-ca/0.log"
Apr 22 19:35:10.943820 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.943732 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4" event={"ID":"101d67b7-3f89-4e0c-9c20-908c4f76491e","Type":"ContainerStarted","Data":"86402a134160d5324f52d433ac9919e53ecc43289ca9c073a3e31ef8ed62a26d"}
Apr 22 19:35:10.943820 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.943771 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4" event={"ID":"101d67b7-3f89-4e0c-9c20-908c4f76491e","Type":"ContainerStarted","Data":"20f96f65ed14db62a4a82bada1cbe21ca71f6c842f0540ed0afb8cc03c744481"}
Apr 22 19:35:10.943820 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.943807 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:10.957972 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:10.957931 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4" podStartSLOduration=1.9579183900000001 podStartE2EDuration="1.95791839s" podCreationTimestamp="2026-04-22 19:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:35:10.957876549 +0000 UTC m=+2898.482540102" watchObservedRunningTime="2026-04-22 19:35:10.95791839 +0000 UTC m=+2898.482581919"
Apr 22 19:35:11.220678 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:11.220648 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-9c6c9466d-5zwxc_a14c1def-cf59-4fe4-a62f-26d8cc86cd77/router/0.log"
Apr 22 19:35:11.563181 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:11.563105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qnb4b_be593d71-f465-4468-8034-246bf4c51e73/serve-healthcheck-canary/0.log"
Apr 22 19:35:11.968315 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:11.968285 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9gjnl_5147e527-afd1-402c-995b-814eebf64541/kube-rbac-proxy/0.log"
Apr 22 19:35:11.982828 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:11.982785 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9gjnl_5147e527-afd1-402c-995b-814eebf64541/exporter/0.log"
Apr 22 19:35:12.002518 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:12.002494 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9gjnl_5147e527-afd1-402c-995b-814eebf64541/extractor/0.log"
Apr 22 19:35:13.971849 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:13.971819 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-wpz6l_91627a29-eef4-43fa-b35c-2b84ba4baf93/manager/0.log"
Apr 22 19:35:14.226884 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:14.226808 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-kv8pn_9c01586d-bde7-474a-9ba6-df24324623f0/manager/0.log"
Apr 22 19:35:16.957549 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:16.957515 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-xlzr4"
Apr 22 19:35:17.941702 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:17.941655 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gkbcc_20a2b785-7a65-4033-ae4d-0275a248aec8/kube-storage-version-migrator-operator/1.log"
Apr 22 19:35:17.942531 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:17.942506 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gkbcc_20a2b785-7a65-4033-ae4d-0275a248aec8/kube-storage-version-migrator-operator/0.log"
Apr 22 19:35:18.905654 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:18.905624 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7frpv_7b10df83-fc14-4a92-9ad2-800fbd71b62e/kube-multus/0.log"
Apr 22 19:35:19.137953 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:19.137924 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hm5p8_b6c4f00f-22a0-4f0d-bbe0-8b9038175a35/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:35:19.164758 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:19.164733 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hm5p8_b6c4f00f-22a0-4f0d-bbe0-8b9038175a35/egress-router-binary-copy/0.log"
Apr 22 19:35:19.184743 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:19.184718 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hm5p8_b6c4f00f-22a0-4f0d-bbe0-8b9038175a35/cni-plugins/0.log"
Apr 22 19:35:19.207540 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:19.207516 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hm5p8_b6c4f00f-22a0-4f0d-bbe0-8b9038175a35/bond-cni-plugin/0.log"
Apr 22 19:35:19.228223 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:19.228202 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hm5p8_b6c4f00f-22a0-4f0d-bbe0-8b9038175a35/routeoverride-cni/0.log"
Apr 22 19:35:19.248366 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:19.248341 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hm5p8_b6c4f00f-22a0-4f0d-bbe0-8b9038175a35/whereabouts-cni-bincopy/0.log"
Apr 22 19:35:19.268978 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:19.268956 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hm5p8_b6c4f00f-22a0-4f0d-bbe0-8b9038175a35/whereabouts-cni/0.log"
Apr 22 19:35:19.521128 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:19.521100 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4q2cb_c33a8222-6663-4971-9e27-d05681becacf/network-metrics-daemon/0.log"
Apr 22 19:35:19.538592 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:19.538572 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4q2cb_c33a8222-6663-4971-9e27-d05681becacf/kube-rbac-proxy/0.log"
Apr 22 19:35:20.327583 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:20.327550 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7zbwn_e12756ff-2896-467a-b08f-4d4ca991e872/ovn-controller/0.log"
Apr 22 19:35:20.356529 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:20.356499 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7zbwn_e12756ff-2896-467a-b08f-4d4ca991e872/ovn-acl-logging/0.log"
Apr 22 19:35:20.373608 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:20.373583 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7zbwn_e12756ff-2896-467a-b08f-4d4ca991e872/kube-rbac-proxy-node/0.log"
Apr 22 19:35:20.394409 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:20.394379 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7zbwn_e12756ff-2896-467a-b08f-4d4ca991e872/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:35:20.414043 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:20.414018 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7zbwn_e12756ff-2896-467a-b08f-4d4ca991e872/northd/0.log"
Apr 22 19:35:20.435185 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:20.435163 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7zbwn_e12756ff-2896-467a-b08f-4d4ca991e872/nbdb/0.log"
Apr 22 19:35:20.454356 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:20.454334 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7zbwn_e12756ff-2896-467a-b08f-4d4ca991e872/sbdb/0.log"
Apr 22 19:35:20.550703 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:20.550670 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7zbwn_e12756ff-2896-467a-b08f-4d4ca991e872/ovnkube-controller/0.log"
Apr 22 19:35:22.187771 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:22.187742 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-qlzm7_4630660b-1003-4863-8be0-f42b5940db5c/check-endpoints/0.log"
Apr 22 19:35:22.234221 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:22.234190 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-h4b8s_14401ce0-56c3-41fc-9d81-5b7fae368b4c/network-check-target-container/0.log"
Apr 22 19:35:23.120405 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:23.120376 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-n7nbq_72313447-1a66-4fcb-905e-c46123a74148/iptables-alerter/0.log"
Apr 22 19:35:23.728100 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:23.728072 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-4h9jv_30ee9556-6a92-4520-a931-8d8ab472a6b5/tuned/0.log"
Apr 22 19:35:25.352465 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:25.352429 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-5xwp6_81219394-9b4e-4e9d-a98d-d0fd92f6277d/cluster-samples-operator/0.log"
Apr 22 19:35:25.368319 ip-10-0-143-56 kubenswrapper[2577]: I0422 19:35:25.368292 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-5xwp6_81219394-9b4e-4e9d-a98d-d0fd92f6277d/cluster-samples-operator-watch/0.log"