Apr 22 19:20:45.068446 ip-10-0-134-231 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:20:45.068459 ip-10-0-134-231 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:20:45.068469 ip-10-0-134-231 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:20:45.068803 ip-10-0-134-231 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:20:56.164197 ip-10-0-134-231 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:20:56.164212 ip-10-0-134-231 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 69430ce73f004dbf8af713e233ea7063 --
Apr 22 19:23:05.474927 ip-10-0-134-231 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.917269 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.925981 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.925993 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.925997 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926000 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926003 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926006 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926008 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:05.945578 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926013 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926017 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926020 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926023 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926026 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926029 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926032 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926035 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926038 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926041 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926043 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926046 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926049 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926052 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926055 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926060 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926063 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926077 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926081 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926084 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:05.946736 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926086 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926089 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926092 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926095 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926098 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926101 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926104 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926106 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926109 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926111 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926114 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926117 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926119 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926122 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926125 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926127 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926130 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926132 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926135 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926137 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:05.947284 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926141 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926143 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926146 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926149 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926151 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926154 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926157 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926160 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926163 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926166 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926169 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926174 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926177 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926180 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926184 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926188 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926191 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926193 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926196 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:05.947870 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926198 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926201 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926204 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926206 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926209 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926212 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926215 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926217 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926220 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926222 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926224 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926227 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926229 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926232 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926234 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926237 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926240 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926242 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926245 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926247 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:05.948403 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926598 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926603 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926606 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926609 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926612 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926615 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926618 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926621 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926624 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926626 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926630 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926632 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926635 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926638 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926640 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926643 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926646 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926648 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926651 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926653 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:05.948996 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926656 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926658 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926675 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926677 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926680 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926683 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926686 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926688 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926691 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926694 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926697 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926699 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926702 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926704 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926707 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926710 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926713 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926715 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926718 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926721 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:05.949648 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926723 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926726 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926728 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926731 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926733 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926736 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926739 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926741 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926744 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926748 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926751 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926753 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926756 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926759 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926761 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926764 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926766 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926769 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926771 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926774 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:05.950279 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926777 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926779 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926782 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926786 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926791 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926793 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926796 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926799 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926801 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926807 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926810 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926814 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926817 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926820 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926823 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926826 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926830 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926832 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926835 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:05.950867 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926838 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926840 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926843 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926849 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926852 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926855 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.926857 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926927 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926934 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926940 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926944 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926949 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926952 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926956 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926961 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926964 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926967 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926970 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926974 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926977 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926979 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926982 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926986 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:23:05.951382 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926989 2572 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926992 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926995 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.926999 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927002 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927005 2572 flags.go:64] FLAG: --config-dir=""
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927008 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927011 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927015 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927018 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927022 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927025 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927029 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927032 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927035 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927038 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927041 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927045 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927048 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927051 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927054 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927057 2572 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927060 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927064 2572 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927067 2572 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:23:05.952026 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927069 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927072 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927076 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927079 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927082 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927086 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927090 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927093 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927095 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927099 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927101 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927104 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927107 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927110 2572 flags.go:64] FLAG: --feature-gates=""
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927114 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927117 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927120 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927123 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927126 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927130 2572 flags.go:64] FLAG: --help="false"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927133 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-134-231.ec2.internal"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927137 2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927140 2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 19:23:05.952699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927143 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927147 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927150 2572 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927153 2572 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927156 2572 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927158 2572 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927161 2572 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927164 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927167 2572 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927170 2572 flags.go:64] FLAG: --kube-reserved=""
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927173 2572 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927176 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927179 2572 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927181 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927185 2572 flags.go:64] FLAG: --lock-file=""
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927189 2572 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927191 2572 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927194 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927199 2572 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927202 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927205 2572 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927208 2572 flags.go:64] FLAG: --logging-format="text"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927211 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927214 2572 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 19:23:05.953321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927216 2572 flags.go:64] FLAG: --manifest-url=""
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927219 2572 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927224 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927226 2572 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927232 2572 flags.go:64] FLAG: --max-pods="110"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927235 2572 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927238 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927241 2572 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927244 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927247 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927250 2572 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927253 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927261 2572 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927264 2572 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927267 2572 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927270 2572 flags.go:64] FLAG: --pod-cidr=""
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927273 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927279 2572 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927282 2572 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927285 2572 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927287 2572 flags.go:64] FLAG: --port="10250"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927290 2572 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927293 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b93e342f5fb5c6bb"
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927297 2572 flags.go:64] FLAG: --qos-reserved=""
Apr 22 19:23:05.953951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927300 2572 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927303 2572 flags.go:64] FLAG: --register-node="true"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927306 2572 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927309 2572 flags.go:64] FLAG: --register-with-taints=""
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927313 2572 flags.go:64] FLAG: --registry-burst="10"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927316 2572 flags.go:64] FLAG: --registry-qps="5"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927318 2572 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927322 2572 flags.go:64] FLAG: --reserved-memory=""
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927325 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927328 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927331 2572 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927334 2572 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927337 2572 flags.go:64] FLAG: --runonce="false"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927341 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927344 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927347 2572 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927350 2572 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927353 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927356 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927359 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927362 2572 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927364 2572 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927367 2572 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927370 2572 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927373 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927376 2572 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 19:23:05.954588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927378 2572 flags.go:64] FLAG: --system-cgroups=""
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927381 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927386 2572 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927389 2572 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927392 2572 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927398 2572 flags.go:64] FLAG: --tls-min-version=""
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927401 2572 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927404 2572 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927407 2572 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927409 2572 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927412 2572 flags.go:64] FLAG: --v="2"
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927416 2572 flags.go:64] FLAG: --version="false"
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927423 2572 flags.go:64] FLAG: --vmodule=""
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927428 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.927431 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927515 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927518 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927521 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927524 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927529 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927533 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927536 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927539 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:05.955507 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927542 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927545 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927548 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927550 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927553 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927555 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927558 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927561 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927563 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927566 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927568 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927571 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927573 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927576 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927580 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927582 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927585 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927587 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927590 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927593 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:05.956155 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927595 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927598 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927600 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927603 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927605 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927607 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927610 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927612 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927617 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927619 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927622 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927624 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927627 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927630 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927632 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927635 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927637 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927640 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927642 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927645 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:05.956783 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927647 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927650 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927653 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927656 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927658 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927673 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927677 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927680 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927682 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927685 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927687 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927690 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927693 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927695 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927698 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927701 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927704 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927708 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927711 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:05.957374 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927714 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927718 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927720 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927723 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927726 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927728 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927731 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927734 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927736 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927739 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927742 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927744 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927747 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927749 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927753 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927755 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927757 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927760 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:05.958248 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.927762 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.928394 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false 
ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.934417 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.934431 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934495 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934501 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934504 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934508 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934511 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934514 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934522 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934525 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934528 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934531 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934534 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934536 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:05.959103 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934539 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934541 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934544 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934547 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934549 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934553 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 
19:23:05.934556 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934558 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934561 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934563 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934566 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934569 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934572 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934574 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934577 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934579 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934581 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934584 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934590 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934593 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:05.960136 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934596 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934600 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934603 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934606 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934609 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934612 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934614 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934617 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934619 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934622 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934624 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934627 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934629 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934632 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934635 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934637 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934640 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934642 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934645 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:05.960974 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934648 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934651 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934653 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934656 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934658 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:05.962023 ip-10-0-134-231 
kubenswrapper[2572]: W0422 19:23:05.934676 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934680 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934685 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934688 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934691 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934694 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934698 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934701 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934703 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934706 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934708 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934711 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934713 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934716 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:05.962023 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934719 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934721 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934724 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934726 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934729 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934732 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934734 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934737 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:05.962861 ip-10-0-134-231 
kubenswrapper[2572]: W0422 19:23:05.934741 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934744 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934746 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934749 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934751 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934754 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934757 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:05.962861 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934760 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.934765 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934862 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934867 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934870 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934873 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934876 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934878 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934881 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934884 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934888 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934891 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934894 2572 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAWS Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934896 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934899 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934901 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:05.963699 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934904 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934907 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934909 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934912 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934914 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934917 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934920 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934924 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934927 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934930 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934933 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934936 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934938 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934941 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934944 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934946 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934949 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934951 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934954 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata 
Apr 22 19:23:05.964513 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934956 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934960 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934963 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934966 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934969 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934972 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934974 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934978 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934980 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934983 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934985 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934988 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934990 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934993 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934995 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.934998 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935001 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935003 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935006 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:05.965781 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935008 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935011 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935013 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup 
Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935017 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935019 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935021 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935024 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935027 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935029 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935032 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935034 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935037 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935040 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935042 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935045 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935047 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935050 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935052 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935055 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935057 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:05.966511 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935061 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935064 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935066 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935069 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935071 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:05.967086 ip-10-0-134-231 
kubenswrapper[2572]: W0422 19:23:05.935074 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935076 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935079 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935081 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935084 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935086 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935089 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935091 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:05.935094 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.935098 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:23:05.967086 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.935678 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.937544 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.938521 2572 server.go:1019] "Starting client certificate rotation" Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.938617 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.939201 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.965481 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.970435 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.985542 2572 log.go:25] "Validated CRI v1 runtime API" Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.992450 2572 log.go:25] "Validated CRI v1 image API" Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 
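The repeated sweeps of "unrecognized feature gate" warnings above come from the kubelet parsing its gate map several times during startup: gate names that exist only in OpenShift's configuration (GatewayAPI, PinnedImages, and so on) are not compiled into this kubelet, so feature_gate.go warns and skips them rather than failing, while the gates it does know end up in the effective map printed at feature_gate.go:384. What follows is a minimal, self-contained Go sketch of that warn-and-skip pattern. It is illustrative only, not the kubelet's actual implementation (which lives in k8s.io/component-base/featuregate), and every identifier in it is invented for the example.

package main

import "fmt"

// defaults stands in for the gate state this binary was compiled with;
// an illustrative subset of the map printed by feature_gate.go:384 above.
var defaults = map[string]bool{
	"ImageVolume":                    true,
	"KMSv1":                          false,
	"NodeSwap":                       false,
	"ServiceAccountTokenNodeBinding": true,
}
var deprecated = map[string]bool{"KMSv1": true}
var lockedGA = map[string]bool{"ServiceAccountTokenNodeBinding": true}

// apply overlays requested gates onto the defaults, warning (not failing)
// on names the binary does not know about; that is what produces the long
// W-level sweep above when a config carries gate names the kubelet lacks.
func apply(requested map[string]bool) map[string]bool {
	effective := make(map[string]bool, len(defaults))
	for k, v := range defaults {
		effective[k] = v
	}
	for name, val := range requested {
		if _, ok := defaults[name]; !ok {
			fmt.Printf("W: unrecognized feature gate: %s\n", name)
			continue
		}
		if deprecated[name] {
			fmt.Printf("W: Setting deprecated feature gate %s=%v. It will be removed in a future release.\n", name, val)
		}
		if lockedGA[name] {
			fmt.Printf("W: Setting GA feature gate %s=%v. It will be removed in a future release.\n", name, val)
		}
		effective[name] = val
	}
	return effective
}

func main() {
	fmt.Printf("I: feature gates: %v\n", apply(map[string]bool{
		"KMSv1": true, "GatewayAPI": true, "PinnedImages": true,
	}))
}

In this sketch, as in the log, an unknown gate is dropped with a warning and never reaches the effective map, which is why the three feature_gate.go:384 lines above list only upstream Kubernetes gates.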
Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.993910 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.998170 2572 fs.go:135] Filesystem UUIDs: map[0edd05cc-aab7-4150-959b-cba24f86db0f:/dev/nvme0n1p3 5766fd59-029e-4749-bed5-9a3a6d925dde:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:05.998185 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:23:06.198834 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.000280 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:06.025111 ip-10-0-134-231 systemd[1]: Started Kubernetes Kubelet.
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.003951 2572 manager.go:217] Machine: {Timestamp:2026-04-22 19:23:06.001796222 +0000 UTC m=+0.411210726 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100872 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec260ace8bb67c35fee2d98885a041c6 SystemUUID:ec260ace-8bb6-7c35-fee2-d98885a041c6 BootID:69430ce7-3f00-4dbf-8af7-13e233ea7063 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c1:67:2a:cd:b3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c1:67:2a:cd:b3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:e1:9c:13:50:de Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.004057 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.004131 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.005256 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.005279 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-231.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.005427 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.005435 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.005448 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.005463 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
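The nodeConfig dump above carries this node's hard-eviction thresholds: memory.available < 100Mi as an absolute quantity, and nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5% as percentages of capacity. Below is a hedged Go sketch of how such a threshold resolves against capacity; the type and function names are invented for illustration, and the kubelet's real eviction manager is considerably more involved.

package main

import "fmt"

// Threshold mirrors the shape of the HardEvictionThresholds entries above:
// each signal carries either an absolute Quantity or a Percentage.
type Threshold struct {
	Signal     string
	Quantity   int64   // bytes; zero when Percentage is used
	Percentage float64 // fraction of capacity; zero when Quantity is used
}

// crossed reports whether the available amount has fallen below the
// threshold, resolving a Percentage against total capacity the way
// eviction signals are defined.
func crossed(t Threshold, available, capacity int64) bool {
	limit := t.Quantity
	if t.Percentage > 0 {
		limit = int64(t.Percentage * float64(capacity))
	}
	return available < limit
}

func main() {
	memAvailable := Threshold{Signal: "memory.available", Quantity: 100 << 20} // 100Mi
	nodefs := Threshold{Signal: "nodefs.available", Percentage: 0.1}           // 10%

	// With the MemoryCapacity (33164484608 bytes) and /var capacity
	// (128243970048 bytes) reported above, and plenty still available,
	// neither hard-eviction signal fires.
	fmt.Println(crossed(memAvailable, 8<<30, 33164484608)) // false
	fmt.Println(crossed(nodefs, 12<<30, 128243970048))     // false
}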
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.007257 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.007362 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.010216 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.010244 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.010255 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.010263 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.010272 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.011414 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:23:06.220794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.011429 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.014942 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.016416 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018577 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018592 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018598 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018604 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018614 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018623 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018638 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018645 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018651 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018657 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018683 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.018694 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.019571 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.019577 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.024302 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.024345 2572 server.go:1295] "Started kubelet"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.024472 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.024403 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.024539 2572 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.024694 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.024731 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-231.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.024852 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-231.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.027400 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.028226 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.034201 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.034447 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.034790 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035589 2572 factory.go:55] Registering systemd factory
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035610 2572 factory.go:223] Registration of the systemd container factory successfully
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035658 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.035698 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-231.ec2.internal\" not found"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035732 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035744 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035831 2572 factory.go:153] Registering CRI-O factory
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035847 2572 factory.go:223] Registration of the crio container factory successfully
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035858 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035865 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035902 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035927 2572 factory.go:103] Registering Raw factory
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.035942 2572 manager.go:1196] Started watching for new ooms in manager
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.037143 2572 manager.go:319] Starting recovery of all containers
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.038054 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 19:23:06.221503 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.038178 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-231.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
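The "Failed to watch" and "Failed to ensure lease exists, will retry ... interval=200ms" errors above are expected at this point in the boot: the kubelet is still talking to the API server as system:anonymous because its client certificate has not been issued yet, and each controller simply retries on an interval until the TLS bootstrap completes. A self-contained Go sketch of that retry-at-interval idea follows; every name in it is invented for illustration, and the real controllers layer jitter and backoff on top of this pattern.

package main

import (
	"context"
	"errors"
	"fmt"
	"time"
)

// retryAtInterval re-runs op on a fixed interval until it succeeds or the
// context ends, which is the behavior behind the "will retry ...
// interval=200ms" line above.
func retryAtInterval(ctx context.Context, interval time.Duration, op func() error) error {
	for {
		err := op()
		if err == nil {
			return nil
		}
		fmt.Printf("E: failed, will retry: %v interval=%s\n", err, interval)
		select {
		case <-ctx.Done():
			return ctx.Err()
		case <-time.After(interval):
		}
	}
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()

	attempts := 0
	_ = retryAtInterval(ctx, 200*time.Millisecond, func() error {
		attempts++
		if attempts < 3 {
			// Stand-in for the forbidden lease request seen while the
			// kubelet still authenticates as system:anonymous.
			return errors.New("lease ensure forbidden")
		}
		return nil
	})
	fmt.Println("lease ensured after", attempts, "attempts")
}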
\"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-231.ec2.internal.18a8c435426b93af default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-231.ec2.internal,UID:ip-10-0-134-231.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-231.ec2.internal,},FirstTimestamp:2026-04-22 19:23:06.024317871 +0000 UTC m=+0.433732379,LastTimestamp:2026-04-22 19:23:06.024317871 +0000 UTC m=+0.433732379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-231.ec2.internal,}" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.045898 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-z62gg" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.052539 2572 manager.go:324] Recovery completed Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.052540 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-z62gg" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.053854 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.056854 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.059180 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.059203 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.059213 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.059589 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.059598 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.059611 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.061360 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-231.ec2.internal.18a8c435447fb37c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-231.ec2.internal,UID:ip-10-0-134-231.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-231.ec2.internal 
status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-231.ec2.internal,},FirstTimestamp:2026-04-22 19:23:06.059191164 +0000 UTC m=+0.468605668,LastTimestamp:2026-04-22 19:23:06.059191164 +0000 UTC m=+0.468605668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-231.ec2.internal,}" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.063538 2572 policy_none.go:49] "None policy: Start" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.063551 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.063561 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.104255 2572 manager.go:341] "Starting Device Plugin manager" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.104278 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.104287 2572 server.go:85] "Starting device plugin registration server" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.104487 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.104497 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.104594 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.104705 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.104719 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.105265 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.105296 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-231.ec2.internal\" not found" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.164111 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.165240 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.165259 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.165274 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.165283 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.165318 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.167065 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.205230 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:06.222918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.206202 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:06.224049 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.206225 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:06.224049 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.206234 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:06.224049 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.206256 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.224049 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.218680 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.224049 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.218697 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-231.ec2.internal\": node \"ip-10-0-134-231.ec2.internal\" not found"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.233916 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-231.ec2.internal\" not found"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.266338 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-231.ec2.internal"]
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.266388 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.267763 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.267796 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.267808 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.270057 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.270167 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.270188 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.270689 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.270707 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.270709 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.270718 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.270729 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.270742 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.272869 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.272888 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.273537 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.273554 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.273564 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.295297 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-231.ec2.internal\" not found" node="ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.299564 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-231.ec2.internal\" not found" node="ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.334995 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-231.ec2.internal\" not found"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.337220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e75517a8cb8f107b2cf6ea889751ce21-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal\" (UID: \"e75517a8cb8f107b2cf6ea889751ce21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.337243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e75517a8cb8f107b2cf6ea889751ce21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal\" (UID: \"e75517a8cb8f107b2cf6ea889751ce21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.337260 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/52cc10a5c9c1a625e58617dcad0f895c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-231.ec2.internal\" (UID: \"52cc10a5c9c1a625e58617dcad0f895c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.435757 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-231.ec2.internal\" not found"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.437972 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/52cc10a5c9c1a625e58617dcad0f895c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-231.ec2.internal\" (UID: \"52cc10a5c9c1a625e58617dcad0f895c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.469067 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.438008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e75517a8cb8f107b2cf6ea889751ce21-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal\" (UID: \"e75517a8cb8f107b2cf6ea889751ce21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.470002 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.438025 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e75517a8cb8f107b2cf6ea889751ce21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal\" (UID: \"e75517a8cb8f107b2cf6ea889751ce21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.470002 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.438051 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e75517a8cb8f107b2cf6ea889751ce21-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal\" (UID: \"e75517a8cb8f107b2cf6ea889751ce21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.470002 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.438058 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/52cc10a5c9c1a625e58617dcad0f895c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-231.ec2.internal\" (UID: \"52cc10a5c9c1a625e58617dcad0f895c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.470002 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.438091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e75517a8cb8f107b2cf6ea889751ce21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal\" (UID: \"e75517a8cb8f107b2cf6ea889751ce21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.728247 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.536453 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-231.ec2.internal\" not found"
Apr 22 19:23:06.728247 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.596764 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.728247 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.602071 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:06.728247 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.637517 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-231.ec2.internal\" not found"
Apr 22 19:23:06.940840 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.737984 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-231.ec2.internal\" not found"
Apr 22 19:23:06.940840 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:06.838544 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-231.ec2.internal\" not found"
Apr 22 19:23:06.940840 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.845483 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:06.940840 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.937781 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:06.940840 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.937873 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:23:06.940840 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.937957 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:06.940840 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:06.937961 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.010814 2572 apiserver.go:52] "Watching apiserver"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.033144 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.033406 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-hqv99","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw","openshift-dns/node-resolver-zv45q","openshift-image-registry/node-ca-d9v44","openshift-multus/multus-xt29w","openshift-network-diagnostics/network-check-target-qptxp","openshift-ovn-kubernetes/ovnkube-node-2jbww","openshift-cluster-node-tuning-operator/tuned-v2tph","openshift-multus/multus-additional-cni-plugins-gxnvz","openshift-multus/network-metrics-daemon-w244z","openshift-network-operator/iptables-alerter-vvjql"]
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.034947 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.035002 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.036214 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hqv99"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.038281 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qn599\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.038422 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.038434 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.038505 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.040402 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zv45q"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.040460 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.040709 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8hszl\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.040740 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.040774 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.042022 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1dd58df5-bc2c-432a-844a-887a587be426-agent-certs\") pod \"konnectivity-agent-hqv99\" (UID: \"1dd58df5-bc2c-432a-844a-887a587be426\") " pod="kube-system/konnectivity-agent-hqv99"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.042042 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1dd58df5-bc2c-432a-844a-887a587be426-konnectivity-ca\") pod \"konnectivity-agent-hqv99\" (UID: \"1dd58df5-bc2c-432a-844a-887a587be426\") " pod="kube-system/konnectivity-agent-hqv99"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.042653 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.042706 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.042873 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xfjtr\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.043215 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d9v44"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.045408 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.045484 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-nzrtl\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.046136 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.046171 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.046767 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.046827 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-231.ec2.internal"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.047502 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.047535 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xt29w"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.047547 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.048034 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:07.179112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.049769 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww"
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.049927 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.050228 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.050430 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wz2bv\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.050449 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.050758 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.051773 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.052026 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4bwq8\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.052404 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.052686 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.052920 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.053103 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v2tph"
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.053881 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.053961 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.054763 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.055192 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.055328 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.055431 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2hncx\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.055881 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gxnvz"
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.057571 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:18:06 +0000 UTC" deadline="2027-10-06 14:05:17.565967371 +0000 UTC"
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.057601 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12762h42m10.50836867s"
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.058035 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.058045 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-78k8d\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.058473 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z"
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.059394 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.059377 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b"
pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.062126 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal"] Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.062144 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-231.ec2.internal"] Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.062209 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vvjql" Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.064470 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.064783 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gk94m\"" Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.064814 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.064883 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.067764 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8brng" Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.075196 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8brng" Apr 22 19:23:07.181385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.136709 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142148 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gls2\" (UniqueName: \"kubernetes.io/projected/def7cd86-6b79-4c5f-900f-f09644520b6b-kube-api-access-2gls2\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142177 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c867e00-703c-41c3-8964-4eee7b3451c9-tmp-dir\") pod \"node-resolver-zv45q\" (UID: \"2c867e00-703c-41c3-8964-4eee7b3451c9\") " pod="openshift-dns/node-resolver-zv45q" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142195 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-node-log\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 
19:23:07.142253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19a311dd-1aa9-4326-8424-accc5f5f330c-ovnkube-script-lib\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142290 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/623abfa5-3755-40bb-bd0c-3fbea347ccc5-iptables-alerter-script\") pod \"iptables-alerter-vvjql\" (UID: \"623abfa5-3755-40bb-bd0c-3fbea347ccc5\") " pod="openshift-network-operator/iptables-alerter-vvjql" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142322 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-lib-modules\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142338 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-run-multus-certs\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142353 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-log-socket\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-device-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-modprobe-d\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142463 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-multus-conf-dir\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142490 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/453fe14e-87da-4452-87b2-f3814fcb9406-multus-daemon-config\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142510 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-etc-kubernetes\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-system-cni-dir\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142553 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-etc-selinux\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.184283 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-sys-fs\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142586 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-sys\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2c867e00-703c-41c3-8964-4eee7b3451c9-hosts-file\") pod \"node-resolver-zv45q\" (UID: \"2c867e00-703c-41c3-8964-4eee7b3451c9\") " pod="openshift-dns/node-resolver-zv45q" Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142625 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-run-netns\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142653 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/623abfa5-3755-40bb-bd0c-3fbea347ccc5-host-slash\") pod \"iptables-alerter-vvjql\" (UID: \"623abfa5-3755-40bb-bd0c-3fbea347ccc5\") " pod="openshift-network-operator/iptables-alerter-vvjql" Apr 
Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142725 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxhwk\" (UniqueName: \"kubernetes.io/projected/fc27d5a5-d1ec-4f19-823f-585b5366a986-kube-api-access-lxhwk\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz"
Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142742 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-socket-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw"
Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142759 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-827ln\" (UniqueName: \"kubernetes.io/projected/f612d772-76e2-4346-ad05-9b85f98f7354-kube-api-access-827ln\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw"
Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142793 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2m9x\" (UniqueName: \"kubernetes.io/projected/453fe14e-87da-4452-87b2-f3814fcb9406-kube-api-access-n2m9x\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w"
Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1dd58df5-bc2c-432a-844a-887a587be426-konnectivity-ca\") pod \"konnectivity-agent-hqv99\" (UID: \"1dd58df5-bc2c-432a-844a-887a587be426\") " pod="kube-system/konnectivity-agent-hqv99"
Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142840 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-sysctl-conf\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph"
Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142856 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc27d5a5-d1ec-4f19-823f-585b5366a986-cni-binary-copy\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz"
Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz"
Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142898 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc27d5a5-d1ec-4f19-823f-585b5366a986-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz"
Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142931 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-sysctl-d\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph"
Apr 22 19:23:07.184932 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-multus-socket-dir-parent\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w"
Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.142992 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-systemd-units\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww"
Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143014 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-slash\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww"
Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143049 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svxtb\" (UniqueName: \"kubernetes.io/projected/19a311dd-1aa9-4326-8424-accc5f5f330c-kube-api-access-svxtb\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww"
Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-var-lib-kubelet\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph"
Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkwmt\" (UniqueName: \"kubernetes.io/projected/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-kube-api-access-vkwmt\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph"
volume \"kube-api-access-vkwmt\" (UniqueName: \"kubernetes.io/projected/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-kube-api-access-vkwmt\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mtw9\" (UniqueName: \"kubernetes.io/projected/2c867e00-703c-41c3-8964-4eee7b3451c9-kube-api-access-4mtw9\") pod \"node-resolver-zv45q\" (UID: \"2c867e00-703c-41c3-8964-4eee7b3451c9\") " pod="openshift-dns/node-resolver-zv45q" Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143135 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1130a4f2-a77f-4484-b9a9-046f3553e57b-host\") pod \"node-ca-d9v44\" (UID: \"1130a4f2-a77f-4484-b9a9-046f3553e57b\") " pod="openshift-image-registry/node-ca-d9v44" Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143159 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-multus-cni-dir\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-os-release\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143207 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/453fe14e-87da-4452-87b2-f3814fcb9406-cni-binary-copy\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143244 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-var-lib-cni-multus\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143271 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-kubernetes\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1dd58df5-bc2c-432a-844a-887a587be426-konnectivity-ca\") pod \"konnectivity-agent-hqv99\" (UID: \"1dd58df5-bc2c-432a-844a-887a587be426\") " pod="kube-system/konnectivity-agent-hqv99" Apr 22 19:23:07.185687 
ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143302 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-run-k8s-cni-cncf-io\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-var-lib-openvswitch\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143361 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-run-openvswitch\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.185687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143386 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-run-ovn\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-cnibin\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143437 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143453 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-sysconfig\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143483 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-run\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143516 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlhr5\" (UniqueName: \"kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5\") pod \"network-check-target-qptxp\" (UID: \"90bba156-f068-4b96-a366-bae94c48b2b6\") " pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx9c7\" (UniqueName: \"kubernetes.io/projected/1130a4f2-a77f-4484-b9a9-046f3553e57b-kube-api-access-hx9c7\") pod \"node-ca-d9v44\" (UID: \"1130a4f2-a77f-4484-b9a9-046f3553e57b\") " pod="openshift-image-registry/node-ca-d9v44" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143573 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-run-systemd\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143597 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-os-release\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143616 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1dd58df5-bc2c-432a-844a-887a587be426-agent-certs\") pod \"konnectivity-agent-hqv99\" (UID: \"1dd58df5-bc2c-432a-844a-887a587be426\") " pod="kube-system/konnectivity-agent-hqv99" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-var-lib-cni-bin\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143649 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-run-netns\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-cni-netd\") pod 
\"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143708 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19a311dd-1aa9-4326-8424-accc5f5f330c-ovnkube-config\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143729 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-cnibin\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.186234 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143775 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-kubelet\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143825 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-system-cni-dir\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143846 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-systemd\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143869 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-host\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143891 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-tmp\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143912 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19a311dd-1aa9-4326-8424-accc5f5f330c-env-overrides\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143936 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-tuned\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143962 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-var-lib-kubelet\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143982 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-hostroot\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.144017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.144035 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19a311dd-1aa9-4326-8424-accc5f5f330c-ovn-node-metrics-cert\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.143979 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.144064 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmfbc\" (UniqueName: \"kubernetes.io/projected/623abfa5-3755-40bb-bd0c-3fbea347ccc5-kube-api-access-tmfbc\") pod \"iptables-alerter-vvjql\" (UID: \"623abfa5-3755-40bb-bd0c-3fbea347ccc5\") " pod="openshift-network-operator/iptables-alerter-vvjql" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.144082 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1130a4f2-a77f-4484-b9a9-046f3553e57b-serviceca\") pod \"node-ca-d9v44\" (UID: \"1130a4f2-a77f-4484-b9a9-046f3553e57b\") " pod="openshift-image-registry/node-ca-d9v44" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.144096 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-etc-openvswitch\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.144112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fc27d5a5-d1ec-4f19-823f-585b5366a986-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.437674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.144126 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-registration-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.146784 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1dd58df5-bc2c-432a-844a-887a587be426-agent-certs\") pod \"konnectivity-agent-hqv99\" (UID: \"1dd58df5-bc2c-432a-844a-887a587be426\") " pod="kube-system/konnectivity-agent-hqv99" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245242 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-var-lib-kubelet\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245265 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkwmt\" (UniqueName: \"kubernetes.io/projected/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-kube-api-access-vkwmt\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 
19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245281 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mtw9\" (UniqueName: \"kubernetes.io/projected/2c867e00-703c-41c3-8964-4eee7b3451c9-kube-api-access-4mtw9\") pod \"node-resolver-zv45q\" (UID: \"2c867e00-703c-41c3-8964-4eee7b3451c9\") " pod="openshift-dns/node-resolver-zv45q" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1130a4f2-a77f-4484-b9a9-046f3553e57b-host\") pod \"node-ca-d9v44\" (UID: \"1130a4f2-a77f-4484-b9a9-046f3553e57b\") " pod="openshift-image-registry/node-ca-d9v44" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-multus-cni-dir\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245324 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-os-release\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245346 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/453fe14e-87da-4452-87b2-f3814fcb9406-cni-binary-copy\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-var-lib-cni-multus\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-kubernetes\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245407 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-run-k8s-cni-cncf-io\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245431 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-var-lib-openvswitch\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 
19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245428 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-var-lib-kubelet\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-run-openvswitch\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-var-lib-cni-multus\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-var-lib-openvswitch\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-run-openvswitch\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.440077 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245512 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-run-k8s-cni-cncf-io\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1130a4f2-a77f-4484-b9a9-046f3553e57b-host\") pod \"node-ca-d9v44\" (UID: \"1130a4f2-a77f-4484-b9a9-046f3553e57b\") " pod="openshift-image-registry/node-ca-d9v44" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-run-ovn\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245616 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-multus-cni-dir\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-cnibin\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245691 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-kubernetes\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245707 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-cnibin\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-os-release\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-sysconfig\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245743 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-run-ovn\") pod \"ovnkube-node-2jbww\" (UID: 
\"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245750 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-sysconfig\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245740 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245769 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-run\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245801 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlhr5\" (UniqueName: \"kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5\") pod \"network-check-target-qptxp\" (UID: \"90bba156-f068-4b96-a366-bae94c48b2b6\") " pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:07.440972 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hx9c7\" (UniqueName: \"kubernetes.io/projected/1130a4f2-a77f-4484-b9a9-046f3553e57b-kube-api-access-hx9c7\") pod \"node-ca-d9v44\" (UID: \"1130a4f2-a77f-4484-b9a9-046f3553e57b\") " pod="openshift-image-registry/node-ca-d9v44" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245842 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-run\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-run-systemd\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.245974 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-run-systemd\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/453fe14e-87da-4452-87b2-f3814fcb9406-cni-binary-copy\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-os-release\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246060 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-var-lib-cni-bin\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246067 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-os-release\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246080 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-run-netns\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-cni-netd\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-var-lib-cni-bin\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19a311dd-1aa9-4326-8424-accc5f5f330c-ovnkube-config\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-cnibin\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246126 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-run-netns\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246141 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-kubelet\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246168 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-cni-netd\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246175 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-kubelet\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246188 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-cnibin\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.441943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-system-cni-dir\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-systemd\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246283 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-host\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.246298 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246299 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-tmp\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19a311dd-1aa9-4326-8424-accc5f5f330c-env-overrides\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-systemd\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246361 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-host\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.246359 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs podName:def7cd86-6b79-4c5f-900f-f09644520b6b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:07.746326632 +0000 UTC m=+2.155741139 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs") pod "network-metrics-daemon-w244z" (UID: "def7cd86-6b79-4c5f-900f-f09644520b6b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246374 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-system-cni-dir\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246390 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-tuned\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-var-lib-kubelet\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246443 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-hostroot\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246614 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-var-lib-kubelet\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19a311dd-1aa9-4326-8424-accc5f5f330c-ovn-node-metrics-cert\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.442537 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246686 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmfbc\" (UniqueName: \"kubernetes.io/projected/623abfa5-3755-40bb-bd0c-3fbea347ccc5-kube-api-access-tmfbc\") pod \"iptables-alerter-vvjql\" (UID: \"623abfa5-3755-40bb-bd0c-3fbea347ccc5\") " pod="openshift-network-operator/iptables-alerter-vvjql" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 
19:23:07.246691 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1130a4f2-a77f-4484-b9a9-046f3553e57b-serviceca\") pod \"node-ca-d9v44\" (UID: \"1130a4f2-a77f-4484-b9a9-046f3553e57b\") " pod="openshift-image-registry/node-ca-d9v44" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246725 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-etc-openvswitch\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fc27d5a5-d1ec-4f19-823f-585b5366a986-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-registration-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246782 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19a311dd-1aa9-4326-8424-accc5f5f330c-env-overrides\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246806 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gls2\" (UniqueName: \"kubernetes.io/projected/def7cd86-6b79-4c5f-900f-f09644520b6b-kube-api-access-2gls2\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246847 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-etc-openvswitch\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246904 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c867e00-703c-41c3-8964-4eee7b3451c9-tmp-dir\") pod \"node-resolver-zv45q\" (UID: 
\"2c867e00-703c-41c3-8964-4eee7b3451c9\") " pod="openshift-dns/node-resolver-zv45q" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246930 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-node-log\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19a311dd-1aa9-4326-8424-accc5f5f330c-ovnkube-script-lib\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/623abfa5-3755-40bb-bd0c-3fbea347ccc5-iptables-alerter-script\") pod \"iptables-alerter-vvjql\" (UID: \"623abfa5-3755-40bb-bd0c-3fbea347ccc5\") " pod="openshift-network-operator/iptables-alerter-vvjql" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247021 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-hostroot\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247025 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-lib-modules\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-registration-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-run-multus-certs\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.443128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-log-socket\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247103 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-node-log\") pod 
\"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247112 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1130a4f2-a77f-4484-b9a9-046f3553e57b-serviceca\") pod \"node-ca-d9v44\" (UID: \"1130a4f2-a77f-4484-b9a9-046f3553e57b\") " pod="openshift-image-registry/node-ca-d9v44" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.246619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19a311dd-1aa9-4326-8424-accc5f5f330c-ovnkube-config\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-device-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-run-multus-certs\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247165 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-modprobe-d\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-device-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-multus-conf-dir\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-log-socket\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247255 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/453fe14e-87da-4452-87b2-f3814fcb9406-multus-daemon-config\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-etc-kubernetes\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247305 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-system-cni-dir\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247326 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-modprobe-d\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247331 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-etc-selinux\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247352 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c867e00-703c-41c3-8964-4eee7b3451c9-tmp-dir\") pod \"node-resolver-zv45q\" (UID: \"2c867e00-703c-41c3-8964-4eee7b3451c9\") " pod="openshift-dns/node-resolver-zv45q" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-sys-fs\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-sys\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.443678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-etc-selinux\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247429 2572 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fc27d5a5-d1ec-4f19-823f-585b5366a986-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247442 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-multus-conf-dir\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-lib-modules\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247445 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-sys\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247496 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-system-cni-dir\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2c867e00-703c-41c3-8964-4eee7b3451c9-hosts-file\") pod \"node-resolver-zv45q\" (UID: \"2c867e00-703c-41c3-8964-4eee7b3451c9\") " pod="openshift-dns/node-resolver-zv45q" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247624 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-run-netns\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247650 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/623abfa5-3755-40bb-bd0c-3fbea347ccc5-host-slash\") pod \"iptables-alerter-vvjql\" (UID: \"623abfa5-3755-40bb-bd0c-3fbea347ccc5\") " pod="openshift-network-operator/iptables-alerter-vvjql" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247707 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/623abfa5-3755-40bb-bd0c-3fbea347ccc5-iptables-alerter-script\") pod \"iptables-alerter-vvjql\" (UID: \"623abfa5-3755-40bb-bd0c-3fbea347ccc5\") " pod="openshift-network-operator/iptables-alerter-vvjql" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247766 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-cni-bin\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-etc-kubernetes\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/623abfa5-3755-40bb-bd0c-3fbea347ccc5-host-slash\") pod \"iptables-alerter-vvjql\" (UID: \"623abfa5-3755-40bb-bd0c-3fbea347ccc5\") " pod="openshift-network-operator/iptables-alerter-vvjql" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247846 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-host-run-netns\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-cni-bin\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247887 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxhwk\" (UniqueName: \"kubernetes.io/projected/fc27d5a5-d1ec-4f19-823f-585b5366a986-kube-api-access-lxhwk\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247896 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19a311dd-1aa9-4326-8424-accc5f5f330c-ovnkube-script-lib\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2c867e00-703c-41c3-8964-4eee7b3451c9-hosts-file\") pod \"node-resolver-zv45q\" (UID: \"2c867e00-703c-41c3-8964-4eee7b3451c9\") " pod="openshift-dns/node-resolver-zv45q" Apr 22 19:23:07.444280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/453fe14e-87da-4452-87b2-f3814fcb9406-multus-daemon-config\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247924 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-sys-fs\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247915 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-socket-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.247982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-827ln\" (UniqueName: \"kubernetes.io/projected/f612d772-76e2-4346-ad05-9b85f98f7354-kube-api-access-827ln\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248013 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f612d772-76e2-4346-ad05-9b85f98f7354-socket-dir\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2m9x\" (UniqueName: \"kubernetes.io/projected/453fe14e-87da-4452-87b2-f3814fcb9406-kube-api-access-n2m9x\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-sysctl-conf\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc27d5a5-d1ec-4f19-823f-585b5366a986-cni-binary-copy\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248356 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-sysctl-conf\") pod \"tuned-v2tph\" 
(UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248383 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc27d5a5-d1ec-4f19-823f-585b5366a986-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc27d5a5-d1ec-4f19-823f-585b5366a986-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248422 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-sysctl-d\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-multus-socket-dir-parent\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248474 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-systemd-units\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248499 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-slash\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248524 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svxtb\" (UniqueName: \"kubernetes.io/projected/19a311dd-1aa9-4326-8424-accc5f5f330c-kube-api-access-svxtb\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.445322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248540 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/453fe14e-87da-4452-87b2-f3814fcb9406-multus-socket-dir-parent\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248564 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" 
(UniqueName: \"kubernetes.io/host-path/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-sysctl-d\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248608 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-host-slash\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248615 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19a311dd-1aa9-4326-8424-accc5f5f330c-systemd-units\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248857 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc27d5a5-d1ec-4f19-823f-585b5366a986-cni-binary-copy\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.248894 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-etc-tuned\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.249180 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-tmp\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.249362 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc27d5a5-d1ec-4f19-823f-585b5366a986-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.249723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19a311dd-1aa9-4326-8424-accc5f5f330c-ovn-node-metrics-cert\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.252943 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx9c7\" (UniqueName: \"kubernetes.io/projected/1130a4f2-a77f-4484-b9a9-046f3553e57b-kube-api-access-hx9c7\") pod \"node-ca-d9v44\" (UID: \"1130a4f2-a77f-4484-b9a9-046f3553e57b\") " pod="openshift-image-registry/node-ca-d9v44" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.253099 2572 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.253115 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.253123 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tlhr5 for pod openshift-network-diagnostics/network-check-target-qptxp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.253174 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5 podName:90bba156-f068-4b96-a366-bae94c48b2b6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:07.75316163 +0000 UTC m=+2.162576120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tlhr5" (UniqueName: "kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5") pod "network-check-target-qptxp" (UID: "90bba156-f068-4b96-a366-bae94c48b2b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.255284 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmfbc\" (UniqueName: \"kubernetes.io/projected/623abfa5-3755-40bb-bd0c-3fbea347ccc5-kube-api-access-tmfbc\") pod \"iptables-alerter-vvjql\" (UID: \"623abfa5-3755-40bb-bd0c-3fbea347ccc5\") " pod="openshift-network-operator/iptables-alerter-vvjql" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.255746 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mtw9\" (UniqueName: \"kubernetes.io/projected/2c867e00-703c-41c3-8964-4eee7b3451c9-kube-api-access-4mtw9\") pod \"node-resolver-zv45q\" (UID: \"2c867e00-703c-41c3-8964-4eee7b3451c9\") " pod="openshift-dns/node-resolver-zv45q" Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:07.256040 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52cc10a5c9c1a625e58617dcad0f895c.slice/crio-bb6da56935947c8a908ab9c07b101628b9ef9baed1c9885b3217f6a9ccce983b WatchSource:0}: Error finding container bb6da56935947c8a908ab9c07b101628b9ef9baed1c9885b3217f6a9ccce983b: Status 404 returned error can't find the container with id bb6da56935947c8a908ab9c07b101628b9ef9baed1c9885b3217f6a9ccce983b Apr 22 19:23:07.445903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.256118 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkwmt\" (UniqueName: \"kubernetes.io/projected/4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef-kube-api-access-vkwmt\") pod \"tuned-v2tph\" (UID: \"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef\") " pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.256574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2m9x\" 
(UniqueName: \"kubernetes.io/projected/453fe14e-87da-4452-87b2-f3814fcb9406-kube-api-access-n2m9x\") pod \"multus-xt29w\" (UID: \"453fe14e-87da-4452-87b2-f3814fcb9406\") " pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:07.256841 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode75517a8cb8f107b2cf6ea889751ce21.slice/crio-4923613e481d301d76c2ca40805f26a61decfc37b1bae137798b650c5db7fb9e WatchSource:0}: Error finding container 4923613e481d301d76c2ca40805f26a61decfc37b1bae137798b650c5db7fb9e: Status 404 returned error can't find the container with id 4923613e481d301d76c2ca40805f26a61decfc37b1bae137798b650c5db7fb9e Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.257790 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svxtb\" (UniqueName: \"kubernetes.io/projected/19a311dd-1aa9-4326-8424-accc5f5f330c-kube-api-access-svxtb\") pod \"ovnkube-node-2jbww\" (UID: \"19a311dd-1aa9-4326-8424-accc5f5f330c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.257970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-827ln\" (UniqueName: \"kubernetes.io/projected/f612d772-76e2-4346-ad05-9b85f98f7354-kube-api-access-827ln\") pod \"aws-ebs-csi-driver-node-4xtkw\" (UID: \"f612d772-76e2-4346-ad05-9b85f98f7354\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.258083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gls2\" (UniqueName: \"kubernetes.io/projected/def7cd86-6b79-4c5f-900f-f09644520b6b-kube-api-access-2gls2\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.258325 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxhwk\" (UniqueName: \"kubernetes.io/projected/fc27d5a5-d1ec-4f19-823f-585b5366a986-kube-api-access-lxhwk\") pod \"multus-additional-cni-plugins-gxnvz\" (UID: \"fc27d5a5-d1ec-4f19-823f-585b5366a986\") " pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.265051 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.358627 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hqv99" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.363943 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:07.364044 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dd58df5_bc2c_432a_844a_887a587be426.slice/crio-47c016deb10d9b0437f7ae44ab1e5514f7d37d86da6ee4fe8f8a0b5d5b15b4e9 WatchSource:0}: Error finding container 47c016deb10d9b0437f7ae44ab1e5514f7d37d86da6ee4fe8f8a0b5d5b15b4e9: Status 404 returned error can't find the container with id 47c016deb10d9b0437f7ae44ab1e5514f7d37d86da6ee4fe8f8a0b5d5b15b4e9 Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.390092 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zv45q" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:07.394957 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c867e00_703c_41c3_8964_4eee7b3451c9.slice/crio-c014a2d84ac52cbe0cbc8f1297ab7bb9e22d96126d3154a8a0e84b9a5d626a87 WatchSource:0}: Error finding container c014a2d84ac52cbe0cbc8f1297ab7bb9e22d96126d3154a8a0e84b9a5d626a87: Status 404 returned error can't find the container with id c014a2d84ac52cbe0cbc8f1297ab7bb9e22d96126d3154a8a0e84b9a5d626a87 Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.403478 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.406479 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d9v44" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:07.413608 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1130a4f2_a77f_4484_b9a9_046f3553e57b.slice/crio-fc69b40f541ad4061699d0e5c745a25a220de85bb3a8a33e53f0bcc626b93a79 WatchSource:0}: Error finding container fc69b40f541ad4061699d0e5c745a25a220de85bb3a8a33e53f0bcc626b93a79: Status 404 returned error can't find the container with id fc69b40f541ad4061699d0e5c745a25a220de85bb3a8a33e53f0bcc626b93a79 Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.422728 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xt29w" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:07.427323 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod453fe14e_87da_4452_87b2_f3814fcb9406.slice/crio-ac312d1d30f1a543f77cf93ff65716f2b113ca07437daf4031d02b7132892689 WatchSource:0}: Error finding container ac312d1d30f1a543f77cf93ff65716f2b113ca07437daf4031d02b7132892689: Status 404 returned error can't find the container with id ac312d1d30f1a543f77cf93ff65716f2b113ca07437daf4031d02b7132892689 Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.438645 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.444210 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v2tph" Apr 22 19:23:07.446415 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:07.444361 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19a311dd_1aa9_4326_8424_accc5f5f330c.slice/crio-e55acb3c9fe627a39377d773c1a36ffd61ddc6846091455131bdde89816ebb32 WatchSource:0}: Error finding container e55acb3c9fe627a39377d773c1a36ffd61ddc6846091455131bdde89816ebb32: Status 404 returned error can't find the container with id e55acb3c9fe627a39377d773c1a36ffd61ddc6846091455131bdde89816ebb32 Apr 22 19:23:07.449290 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:07.449270 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d1c5d77_dc78_43c4_92de_a6f2ba9c71ef.slice/crio-a5768487a626c61917472b4696a27a81ce32f0d6cd68ddab9d694c9f6826d93a WatchSource:0}: Error finding container a5768487a626c61917472b4696a27a81ce32f0d6cd68ddab9d694c9f6826d93a: Status 404 returned error can't find the container with id a5768487a626c61917472b4696a27a81ce32f0d6cd68ddab9d694c9f6826d93a Apr 22 19:23:07.461657 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.461641 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gxnvz" Apr 22 19:23:07.467352 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:07.467329 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc27d5a5_d1ec_4f19_823f_585b5366a986.slice/crio-e0d78e105ca3dfed51eac343eac6d4d704cb0d05d008c223653a8e9090b67361 WatchSource:0}: Error finding container e0d78e105ca3dfed51eac343eac6d4d704cb0d05d008c223653a8e9090b67361: Status 404 returned error can't find the container with id e0d78e105ca3dfed51eac343eac6d4d704cb0d05d008c223653a8e9090b67361 Apr 22 19:23:07.469469 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.469452 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-vvjql" Apr 22 19:23:07.474678 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:07.474645 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod623abfa5_3755_40bb_bd0c_3fbea347ccc5.slice/crio-c84cc2107f6c28b850dd1952cbe6e63ed52462c83b1b85e8043a560d0ead05a8 WatchSource:0}: Error finding container c84cc2107f6c28b850dd1952cbe6e63ed52462c83b1b85e8043a560d0ead05a8: Status 404 returned error can't find the container with id c84cc2107f6c28b850dd1952cbe6e63ed52462c83b1b85e8043a560d0ead05a8 Apr 22 19:23:07.751595 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.751525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:07.751719 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.751614 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:07.751719 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.751698 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs podName:def7cd86-6b79-4c5f-900f-f09644520b6b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:08.751677912 +0000 UTC m=+3.161092403 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs") pod "network-metrics-daemon-w244z" (UID: "def7cd86-6b79-4c5f-900f-f09644520b6b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:07.852566 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.852529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlhr5\" (UniqueName: \"kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5\") pod \"network-check-target-qptxp\" (UID: \"90bba156-f068-4b96-a366-bae94c48b2b6\") " pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:07.852733 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.852707 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:07.852809 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.852741 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:07.852809 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.852754 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tlhr5 for pod openshift-network-diagnostics/network-check-target-qptxp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:07.852809 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:07.852807 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5 
podName:90bba156-f068-4b96-a366-bae94c48b2b6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:08.852787449 +0000 UTC m=+3.262201954 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlhr5" (UniqueName: "kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5") pod "network-check-target-qptxp" (UID: "90bba156-f068-4b96-a366-bae94c48b2b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:07.882922 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:07.882726 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:08.076771 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.076631 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:07 +0000 UTC" deadline="2027-11-16 17:40:05.603597227 +0000 UTC" Apr 22 19:23:08.076771 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.076675 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13750h16m57.52693988s" Apr 22 19:23:08.189905 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.189803 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zv45q" event={"ID":"2c867e00-703c-41c3-8964-4eee7b3451c9","Type":"ContainerStarted","Data":"c014a2d84ac52cbe0cbc8f1297ab7bb9e22d96126d3154a8a0e84b9a5d626a87"} Apr 22 19:23:08.206127 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.206091 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hqv99" event={"ID":"1dd58df5-bc2c-432a-844a-887a587be426","Type":"ContainerStarted","Data":"47c016deb10d9b0437f7ae44ab1e5514f7d37d86da6ee4fe8f8a0b5d5b15b4e9"} Apr 22 19:23:08.216007 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.215979 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal" event={"ID":"e75517a8cb8f107b2cf6ea889751ce21","Type":"ContainerStarted","Data":"4923613e481d301d76c2ca40805f26a61decfc37b1bae137798b650c5db7fb9e"} Apr 22 19:23:08.226996 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.226954 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vvjql" event={"ID":"623abfa5-3755-40bb-bd0c-3fbea347ccc5","Type":"ContainerStarted","Data":"c84cc2107f6c28b850dd1952cbe6e63ed52462c83b1b85e8043a560d0ead05a8"} Apr 22 19:23:08.251232 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.251205 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxnvz" event={"ID":"fc27d5a5-d1ec-4f19-823f-585b5366a986","Type":"ContainerStarted","Data":"e0d78e105ca3dfed51eac343eac6d4d704cb0d05d008c223653a8e9090b67361"} Apr 22 19:23:08.261410 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.261382 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" event={"ID":"19a311dd-1aa9-4326-8424-accc5f5f330c","Type":"ContainerStarted","Data":"e55acb3c9fe627a39377d773c1a36ffd61ddc6846091455131bdde89816ebb32"} Apr 22 19:23:08.265233 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.265207 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" event={"ID":"f612d772-76e2-4346-ad05-9b85f98f7354","Type":"ContainerStarted","Data":"d11439b78888c4eef2ec1571be2addd3527760123e4a19c058bb8fb62b0240ef"} Apr 22 19:23:08.278754 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.278724 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-231.ec2.internal" event={"ID":"52cc10a5c9c1a625e58617dcad0f895c","Type":"ContainerStarted","Data":"bb6da56935947c8a908ab9c07b101628b9ef9baed1c9885b3217f6a9ccce983b"} Apr 22 19:23:08.308734 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.307071 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v2tph" event={"ID":"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef","Type":"ContainerStarted","Data":"a5768487a626c61917472b4696a27a81ce32f0d6cd68ddab9d694c9f6826d93a"} Apr 22 19:23:08.312280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.312233 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xt29w" event={"ID":"453fe14e-87da-4452-87b2-f3814fcb9406","Type":"ContainerStarted","Data":"ac312d1d30f1a543f77cf93ff65716f2b113ca07437daf4031d02b7132892689"} Apr 22 19:23:08.323092 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.323069 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d9v44" event={"ID":"1130a4f2-a77f-4484-b9a9-046f3553e57b","Type":"ContainerStarted","Data":"fc69b40f541ad4061699d0e5c745a25a220de85bb3a8a33e53f0bcc626b93a79"} Apr 22 19:23:08.758352 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.758317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:08.758543 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:08.758471 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:08.758543 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:08.758528 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs podName:def7cd86-6b79-4c5f-900f-f09644520b6b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:10.758510512 +0000 UTC m=+5.167925010 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs") pod "network-metrics-daemon-w244z" (UID: "def7cd86-6b79-4c5f-900f-f09644520b6b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:08.860135 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:08.859537 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlhr5\" (UniqueName: \"kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5\") pod \"network-check-target-qptxp\" (UID: \"90bba156-f068-4b96-a366-bae94c48b2b6\") " pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:08.860135 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:08.859730 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:08.860135 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:08.859748 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:08.860135 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:08.859761 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tlhr5 for pod openshift-network-diagnostics/network-check-target-qptxp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:08.860135 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:08.859820 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5 podName:90bba156-f068-4b96-a366-bae94c48b2b6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:10.859800821 +0000 UTC m=+5.269215315 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlhr5" (UniqueName: "kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5") pod "network-check-target-qptxp" (UID: "90bba156-f068-4b96-a366-bae94c48b2b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:09.077614 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:09.077500 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:07 +0000 UTC" deadline="2028-01-02 17:01:53.924153734 +0000 UTC" Apr 22 19:23:09.077614 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:09.077535 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14877h38m44.846622048s" Apr 22 19:23:09.091031 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:09.091008 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:09.166133 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:09.166105 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:09.166286 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:09.166227 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:09.166645 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:09.166627 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:09.166765 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:09.166745 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:10.778166 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:10.777905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:10.778617 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:10.778142 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:10.778617 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:10.778268 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs podName:def7cd86-6b79-4c5f-900f-f09644520b6b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:14.77824843 +0000 UTC m=+9.187662927 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs") pod "network-metrics-daemon-w244z" (UID: "def7cd86-6b79-4c5f-900f-f09644520b6b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:10.879209 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:10.879174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlhr5\" (UniqueName: \"kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5\") pod \"network-check-target-qptxp\" (UID: \"90bba156-f068-4b96-a366-bae94c48b2b6\") " pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:10.879361 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:10.879336 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:10.879361 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:10.879354 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:10.879506 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:10.879366 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tlhr5 for pod openshift-network-diagnostics/network-check-target-qptxp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:10.879506 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:10.879421 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5 podName:90bba156-f068-4b96-a366-bae94c48b2b6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:14.879403107 +0000 UTC m=+9.288817613 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlhr5" (UniqueName: "kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5") pod "network-check-target-qptxp" (UID: "90bba156-f068-4b96-a366-bae94c48b2b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:11.165853 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:11.165818 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:11.166048 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:11.165963 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:11.166048 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:11.165965 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:11.166165 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:11.166061 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:13.166332 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:13.165726 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:13.166332 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:13.165861 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:13.166332 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:13.165925 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:13.166332 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:13.165990 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:14.811008 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:14.810974 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:14.811456 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:14.811141 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:14.811456 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:14.811200 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs podName:def7cd86-6b79-4c5f-900f-f09644520b6b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:22.811182369 +0000 UTC m=+17.220596864 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs") pod "network-metrics-daemon-w244z" (UID: "def7cd86-6b79-4c5f-900f-f09644520b6b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:14.912562 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:14.912528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlhr5\" (UniqueName: \"kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5\") pod \"network-check-target-qptxp\" (UID: \"90bba156-f068-4b96-a366-bae94c48b2b6\") " pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:14.912773 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:14.912754 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:14.912773 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:14.912773 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:14.912903 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:14.912785 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tlhr5 for pod openshift-network-diagnostics/network-check-target-qptxp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:14.912903 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:14.912841 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5 podName:90bba156-f068-4b96-a366-bae94c48b2b6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:22.912823823 +0000 UTC m=+17.322238328 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlhr5" (UniqueName: "kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5") pod "network-check-target-qptxp" (UID: "90bba156-f068-4b96-a366-bae94c48b2b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:15.165999 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:15.165968 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:15.166152 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:15.165968 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:15.166152 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:15.166106 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:15.166272 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:15.166171 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:17.165840 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:17.165804 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:17.166304 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:17.165840 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:17.166304 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:17.165937 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:17.166304 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:17.166031 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:19.166149 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:19.166112 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:19.166560 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:19.166112 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:19.166560 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:19.166227 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:19.166560 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:19.166304 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:21.165872 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:21.165848 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:21.165872 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:21.165857 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:21.166320 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:21.165947 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:21.166320 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:21.166083 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:22.874095 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:22.874059 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:22.874530 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:22.874211 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:22.874530 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:22.874291 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs podName:def7cd86-6b79-4c5f-900f-f09644520b6b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:38.8742703 +0000 UTC m=+33.283684818 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs") pod "network-metrics-daemon-w244z" (UID: "def7cd86-6b79-4c5f-900f-f09644520b6b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:22.974691 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:22.974646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlhr5\" (UniqueName: \"kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5\") pod \"network-check-target-qptxp\" (UID: \"90bba156-f068-4b96-a366-bae94c48b2b6\") " pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:22.974842 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:22.974805 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:22.974842 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:22.974828 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:22.974842 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:22.974839 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tlhr5 for pod openshift-network-diagnostics/network-check-target-qptxp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:22.974989 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:22.974897 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5 podName:90bba156-f068-4b96-a366-bae94c48b2b6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:38.974875951 +0000 UTC m=+33.384290444 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlhr5" (UniqueName: "kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5") pod "network-check-target-qptxp" (UID: "90bba156-f068-4b96-a366-bae94c48b2b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:23.166464 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:23.166435 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:23.166622 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:23.166446 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:23.166622 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:23.166536 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:23.166772 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:23.166639 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:25.165483 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:25.165461 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:25.165780 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:25.165461 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:25.165780 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:25.165577 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:25.165780 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:25.165629 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:26.360962 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.360654 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" event={"ID":"19a311dd-1aa9-4326-8424-accc5f5f330c","Type":"ContainerStarted","Data":"b37a8689df3ef91ce12810dda91175aef560ae5dc980a84f22d5c328f820fb30"} Apr 22 19:23:26.360962 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.360925 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" event={"ID":"19a311dd-1aa9-4326-8424-accc5f5f330c","Type":"ContainerStarted","Data":"3902c3f9cc72ccb0c42a5d1e75e7fa0add21ac93332b5ba3cace6d0b0169ecf4"} Apr 22 19:23:26.360962 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.360946 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" event={"ID":"19a311dd-1aa9-4326-8424-accc5f5f330c","Type":"ContainerStarted","Data":"1e7c22bdc25a5c0bc36cd0809f2c181ffd69dc50deb443d6b9ed397fbdb0ae6b"} Apr 22 19:23:26.360962 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.360959 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" event={"ID":"19a311dd-1aa9-4326-8424-accc5f5f330c","Type":"ContainerStarted","Data":"dda798050f87556fe8bc0628e3a7c4341c04a8680a5ac8522523177775d621cd"} Apr 22 19:23:26.361800 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.360971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" event={"ID":"19a311dd-1aa9-4326-8424-accc5f5f330c","Type":"ContainerStarted","Data":"ab07d715a8dbab27a0e71f092f7a9c118d5259ff8e00a79916c8842600c743ae"} Apr 22 19:23:26.361800 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.360985 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" event={"ID":"19a311dd-1aa9-4326-8424-accc5f5f330c","Type":"ContainerStarted","Data":"01250dac140d6c96cf9f384941eeb97680c98caa33d3b60c78bbf4857f0abc32"} Apr 22 19:23:26.362313 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.362283 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-231.ec2.internal" event={"ID":"52cc10a5c9c1a625e58617dcad0f895c","Type":"ContainerStarted","Data":"6442d058f23afbb9f263d566482ae8775e2950192081f665c4309a638575413c"} Apr 22 19:23:26.364941 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.364919 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v2tph" event={"ID":"4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef","Type":"ContainerStarted","Data":"deb1a544c1dfa4252dd26f597a478c0a321750ca85d2f8ee6a593353008aa207"} Apr 22 19:23:26.368649 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.368627 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xt29w" event={"ID":"453fe14e-87da-4452-87b2-f3814fcb9406","Type":"ContainerStarted","Data":"52345af3dd84dc17ce2509fa94dfe75fc4ee44e738a41ef6c4e1ea02081dc7a7"} Apr 22 19:23:26.375216 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.375169 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-231.ec2.internal" podStartSLOduration=19.375154432 podStartE2EDuration="19.375154432s" podCreationTimestamp="2026-04-22 19:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:26.374751239 +0000 UTC m=+20.784165753" watchObservedRunningTime="2026-04-22 19:23:26.375154432 +0000 UTC m=+20.784568948" Apr 22 19:23:26.405931 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.405896 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xt29w" podStartSLOduration=2.423487621 podStartE2EDuration="20.405884659s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:23:07.428620735 +0000 UTC m=+1.838035226" lastFinishedPulling="2026-04-22 19:23:25.41101777 +0000 UTC m=+19.820432264" observedRunningTime="2026-04-22 19:23:26.39125575 +0000 UTC m=+20.800670265" watchObservedRunningTime="2026-04-22 19:23:26.405884659 +0000 UTC m=+20.815299172" Apr 22 19:23:26.406913 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:26.406197 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-v2tph" podStartSLOduration=2.699473631 podStartE2EDuration="20.406190066s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:23:07.450564679 +0000 UTC m=+1.859979169" lastFinishedPulling="2026-04-22 19:23:25.157281112 +0000 UTC m=+19.566695604" observedRunningTime="2026-04-22 19:23:26.405764685 +0000 UTC m=+20.815179208" watchObservedRunningTime="2026-04-22 19:23:26.406190066 +0000 UTC m=+20.815604580" Apr 22 19:23:27.165916 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.165749 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:27.166163 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.165810 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:27.166163 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:27.166010 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:27.166163 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:27.166055 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:27.286400 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.286375 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:23:27.371620 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.371593 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" event={"ID":"f612d772-76e2-4346-ad05-9b85f98f7354","Type":"ContainerStarted","Data":"a05521300d0773df6e26ace36a671b20a8740dca5bcac5c5487ebffca9d1f30e"} Apr 22 19:23:27.372039 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.371627 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" event={"ID":"f612d772-76e2-4346-ad05-9b85f98f7354","Type":"ContainerStarted","Data":"0dfca8a172ef2fe35036ced3bfce69c4fa7a88e2ce113bbdce0c043b680f19d1"} Apr 22 19:23:27.372874 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.372854 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d9v44" event={"ID":"1130a4f2-a77f-4484-b9a9-046f3553e57b","Type":"ContainerStarted","Data":"229141b5729a31bf98e4191c3158d8c6a827581faa1d38785d9471134f9dbe2c"} Apr 22 19:23:27.374047 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.374025 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zv45q" event={"ID":"2c867e00-703c-41c3-8964-4eee7b3451c9","Type":"ContainerStarted","Data":"1efe3033650d43869d8606c2c8c4083cfcb79682297f3f879995d59af45990d3"} Apr 22 19:23:27.375170 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.375151 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hqv99" event={"ID":"1dd58df5-bc2c-432a-844a-887a587be426","Type":"ContainerStarted","Data":"d7f5da6b3e8e170971e0c4f48aaba5362909e23b596a46b2c0e713903e7e0ceb"} Apr 22 19:23:27.376384 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.376365 2572 generic.go:358] "Generic (PLEG): container finished" podID="e75517a8cb8f107b2cf6ea889751ce21" containerID="5042eb43fd05030d0cd64bcfcdb56443e7c31b9c349faec21d1d2b7f40978ee8" exitCode=0 Apr 22 19:23:27.376471 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.376415 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal" event={"ID":"e75517a8cb8f107b2cf6ea889751ce21","Type":"ContainerDied","Data":"5042eb43fd05030d0cd64bcfcdb56443e7c31b9c349faec21d1d2b7f40978ee8"} Apr 22 19:23:27.377742 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.377721 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vvjql" event={"ID":"623abfa5-3755-40bb-bd0c-3fbea347ccc5","Type":"ContainerStarted","Data":"46eb43a9f6eb97ba7c888d5bf28f3d6607c83933b0629124ffcef08eb1a0ec4b"} Apr 22 19:23:27.379013 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.378993 2572 generic.go:358] "Generic (PLEG): container finished" podID="fc27d5a5-d1ec-4f19-823f-585b5366a986" containerID="e63ae1479752ba7e384130650a0dcf52862857f14e5a4114eff4dc1ce70dbfa7" exitCode=0 Apr 22 19:23:27.379100 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.379075 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxnvz" 
event={"ID":"fc27d5a5-d1ec-4f19-823f-585b5366a986","Type":"ContainerDied","Data":"e63ae1479752ba7e384130650a0dcf52862857f14e5a4114eff4dc1ce70dbfa7"} Apr 22 19:23:27.386058 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.386022 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d9v44" podStartSLOduration=3.6462034169999997 podStartE2EDuration="21.386011548s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:23:07.414876119 +0000 UTC m=+1.824290610" lastFinishedPulling="2026-04-22 19:23:25.154684244 +0000 UTC m=+19.564098741" observedRunningTime="2026-04-22 19:23:27.385680042 +0000 UTC m=+21.795094553" watchObservedRunningTime="2026-04-22 19:23:27.386011548 +0000 UTC m=+21.795426061" Apr 22 19:23:27.397918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.397885 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-hqv99" podStartSLOduration=3.639438109 podStartE2EDuration="21.397874914s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:23:07.36568047 +0000 UTC m=+1.775094961" lastFinishedPulling="2026-04-22 19:23:25.124117273 +0000 UTC m=+19.533531766" observedRunningTime="2026-04-22 19:23:27.397066931 +0000 UTC m=+21.806481444" watchObservedRunningTime="2026-04-22 19:23:27.397874914 +0000 UTC m=+21.807289426" Apr 22 19:23:27.409632 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.409594 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vvjql" podStartSLOduration=3.730758252 podStartE2EDuration="21.409584208s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:23:07.475953809 +0000 UTC m=+1.885368304" lastFinishedPulling="2026-04-22 19:23:25.154779754 +0000 UTC m=+19.564194260" observedRunningTime="2026-04-22 19:23:27.40941134 +0000 UTC m=+21.818825854" watchObservedRunningTime="2026-04-22 19:23:27.409584208 +0000 UTC m=+21.818998720" Apr 22 19:23:27.423492 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:27.423430 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zv45q" podStartSLOduration=3.695608435 podStartE2EDuration="21.423419333s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:23:07.396249082 +0000 UTC m=+1.805663577" lastFinishedPulling="2026-04-22 19:23:25.124059966 +0000 UTC m=+19.533474475" observedRunningTime="2026-04-22 19:23:27.422657448 +0000 UTC m=+21.832071962" watchObservedRunningTime="2026-04-22 19:23:27.423419333 +0000 UTC m=+21.832833846" Apr 22 19:23:28.117376 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:28.116788 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:23:27.286396144Z","UUID":"089470c3-d322-4bd4-9e65-aac547602eab","Handler":null,"Name":"","Endpoint":""} Apr 22 19:23:28.118586 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:28.118564 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:23:28.118721 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:28.118605 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:23:28.385440 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:28.385362 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal" event={"ID":"e75517a8cb8f107b2cf6ea889751ce21","Type":"ContainerStarted","Data":"13a64c0de7d54b502321f26ae769cb7efb4028653e266a0d2f2b18902dd00dc7"} Apr 22 19:23:28.388435 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:28.388406 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" event={"ID":"19a311dd-1aa9-4326-8424-accc5f5f330c","Type":"ContainerStarted","Data":"a67c71dca2b84eba27e78b871d294ddb4700c2dc0854e144a8f2066315b9c0ef"} Apr 22 19:23:28.393135 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:28.393107 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" event={"ID":"f612d772-76e2-4346-ad05-9b85f98f7354","Type":"ContainerStarted","Data":"6dd25b26b90c1974b44eaf79225e0cf5cfb6676c36f851287ff5bbbb57032426"} Apr 22 19:23:28.400647 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:28.400584 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-231.ec2.internal" podStartSLOduration=21.400569923 podStartE2EDuration="21.400569923s" podCreationTimestamp="2026-04-22 19:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:28.399985872 +0000 UTC m=+22.809400391" watchObservedRunningTime="2026-04-22 19:23:28.400569923 +0000 UTC m=+22.809984436" Apr 22 19:23:29.165769 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:29.165712 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:29.165949 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:29.165712 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:29.165949 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:29.165831 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:29.165949 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:29.165907 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:29.436482 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:29.436403 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-hqv99" Apr 22 19:23:29.437135 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:29.437114 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-hqv99" Apr 22 19:23:29.453567 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:29.453528 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4xtkw" podStartSLOduration=2.801630167 podStartE2EDuration="23.453515384s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:23:07.371301288 +0000 UTC m=+1.780715793" lastFinishedPulling="2026-04-22 19:23:28.023186501 +0000 UTC m=+22.432601010" observedRunningTime="2026-04-22 19:23:28.41389053 +0000 UTC m=+22.823305043" watchObservedRunningTime="2026-04-22 19:23:29.453515384 +0000 UTC m=+23.862929896" Apr 22 19:23:30.396212 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:30.395978 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-hqv99" Apr 22 19:23:30.396512 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:30.396493 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-hqv99" Apr 22 19:23:31.165754 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:31.165715 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:31.166133 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:31.165715 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:31.166133 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:31.165859 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:31.166133 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:31.165877 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:31.401482 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:31.401416 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" event={"ID":"19a311dd-1aa9-4326-8424-accc5f5f330c","Type":"ContainerStarted","Data":"a1e511d96208bba6e52edb58a8e98c565e9a11421bb0f7082149edab507a673a"} Apr 22 19:23:31.401713 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:31.401539 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:31.401713 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:31.401562 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:31.416463 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:31.416405 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:31.417576 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:31.417557 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:31.432392 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:31.432338 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" podStartSLOduration=7.128371334 podStartE2EDuration="25.432319853s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:23:07.4468299 +0000 UTC m=+1.856244391" lastFinishedPulling="2026-04-22 19:23:25.750778414 +0000 UTC m=+20.160192910" observedRunningTime="2026-04-22 19:23:31.431745186 +0000 UTC m=+25.841159716" watchObservedRunningTime="2026-04-22 19:23:31.432319853 +0000 UTC m=+25.841734368" Apr 22 19:23:32.142209 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.141982 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-84wfc"] Apr 22 19:23:32.159169 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.159147 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:32.159302 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:32.159210 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-84wfc" podUID="91f11754-ebaa-489c-8dad-189955ae35aa" Apr 22 19:23:32.242218 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.242190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/91f11754-ebaa-489c-8dad-189955ae35aa-kubelet-config\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:32.242917 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.242238 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/91f11754-ebaa-489c-8dad-189955ae35aa-dbus\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:32.242917 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.242298 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:32.343388 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.343358 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/91f11754-ebaa-489c-8dad-189955ae35aa-kubelet-config\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:32.343534 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.343406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/91f11754-ebaa-489c-8dad-189955ae35aa-dbus\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:32.343534 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.343490 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/91f11754-ebaa-489c-8dad-189955ae35aa-kubelet-config\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:32.343534 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.343525 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/91f11754-ebaa-489c-8dad-189955ae35aa-dbus\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:32.343659 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.343539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:32.343659 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:32.343637 2572 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:32.343753 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:32.343702 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret podName:91f11754-ebaa-489c-8dad-189955ae35aa nodeName:}" failed. No retries permitted until 2026-04-22 19:23:32.843688235 +0000 UTC m=+27.253102728 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret") pod "global-pull-secret-syncer-84wfc" (UID: "91f11754-ebaa-489c-8dad-189955ae35aa") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:32.403845 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.403818 2572 generic.go:358] "Generic (PLEG): container finished" podID="fc27d5a5-d1ec-4f19-823f-585b5366a986" containerID="6dcb93ab6ea3f869b0b8d073722b8b29a4e25f3efd0c12b4078dc2beb71c68b4" exitCode=0 Apr 22 19:23:32.403980 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.403910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxnvz" event={"ID":"fc27d5a5-d1ec-4f19-823f-585b5366a986","Type":"ContainerDied","Data":"6dcb93ab6ea3f869b0b8d073722b8b29a4e25f3efd0c12b4078dc2beb71c68b4"} Apr 22 19:23:32.404043 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.404027 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:23:32.847293 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:32.847214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:32.847525 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:32.847374 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:32.847525 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:32.847460 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret podName:91f11754-ebaa-489c-8dad-189955ae35aa nodeName:}" failed. No retries permitted until 2026-04-22 19:23:33.847439768 +0000 UTC m=+28.256854262 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret") pod "global-pull-secret-syncer-84wfc" (UID: "91f11754-ebaa-489c-8dad-189955ae35aa") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:33.166082 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.166054 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:33.166195 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.166054 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:33.166254 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:33.166203 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:33.166254 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:33.166237 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:33.354631 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.354594 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-w244z"] Apr 22 19:23:33.358994 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.358959 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qptxp"] Apr 22 19:23:33.360586 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.360561 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-84wfc"] Apr 22 19:23:33.360726 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.360711 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:33.360878 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:33.360845 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-84wfc" podUID="91f11754-ebaa-489c-8dad-189955ae35aa" Apr 22 19:23:33.407508 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.407480 2572 generic.go:358] "Generic (PLEG): container finished" podID="fc27d5a5-d1ec-4f19-823f-585b5366a986" containerID="2c3906c35a009192c8e383325d7f25dddb3244ac945ced50a00d748954239eb7" exitCode=0 Apr 22 19:23:33.407648 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.407591 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxnvz" event={"ID":"fc27d5a5-d1ec-4f19-823f-585b5366a986","Type":"ContainerDied","Data":"2c3906c35a009192c8e383325d7f25dddb3244ac945ced50a00d748954239eb7"} Apr 22 19:23:33.407648 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.407619 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:33.407648 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.407624 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:33.407796 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.407755 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:23:33.407899 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:33.407879 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:33.408024 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:33.408005 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:33.855062 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:33.854981 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:33.855213 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:33.855136 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:33.855274 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:33.855212 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret podName:91f11754-ebaa-489c-8dad-189955ae35aa nodeName:}" failed. No retries permitted until 2026-04-22 19:23:35.855191445 +0000 UTC m=+30.264605950 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret") pod "global-pull-secret-syncer-84wfc" (UID: "91f11754-ebaa-489c-8dad-189955ae35aa") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:34.411170 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:34.411139 2572 generic.go:358] "Generic (PLEG): container finished" podID="fc27d5a5-d1ec-4f19-823f-585b5366a986" containerID="6ebb9194275a24039d9c90179d8b636df57729405df754d7caba21432f9be645" exitCode=0 Apr 22 19:23:34.411444 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:34.411192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxnvz" event={"ID":"fc27d5a5-d1ec-4f19-823f-585b5366a986","Type":"ContainerDied","Data":"6ebb9194275a24039d9c90179d8b636df57729405df754d7caba21432f9be645"} Apr 22 19:23:35.166065 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:35.166037 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:35.166238 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:35.166076 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:35.166238 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:35.166075 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:35.166238 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:35.166179 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:35.166403 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:35.166252 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:35.166403 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:35.166352 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-84wfc" podUID="91f11754-ebaa-489c-8dad-189955ae35aa" Apr 22 19:23:35.656336 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:35.656302 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww" Apr 22 19:23:35.870810 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:35.870771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:35.870966 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:35.870877 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:35.870966 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:35.870936 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret podName:91f11754-ebaa-489c-8dad-189955ae35aa nodeName:}" failed. No retries permitted until 2026-04-22 19:23:39.870923208 +0000 UTC m=+34.280337699 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret") pod "global-pull-secret-syncer-84wfc" (UID: "91f11754-ebaa-489c-8dad-189955ae35aa") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:37.166488 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:37.166458 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:37.167038 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:37.166458 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:37.167038 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:37.166590 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:23:37.167038 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:37.166458 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:37.167038 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:37.166649 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qptxp" podUID="90bba156-f068-4b96-a366-bae94c48b2b6" Apr 22 19:23:37.167038 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:37.166774 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-84wfc" podUID="91f11754-ebaa-489c-8dad-189955ae35aa" Apr 22 19:23:38.411119 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.411089 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-231.ec2.internal" event="NodeReady" Apr 22 19:23:38.411619 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.411240 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:23:38.456952 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.456924 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cr9b2"] Apr 22 19:23:38.489424 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.489375 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bv9cr"] Apr 22 19:23:38.489587 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.489535 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.492289 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.492256 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:23:38.492422 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.492403 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6f5sr\"" Apr 22 19:23:38.493159 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.492681 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:23:38.504303 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.504278 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cr9b2"] Apr 22 19:23:38.504303 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.504304 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bv9cr"] Apr 22 19:23:38.504473 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.504398 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:23:38.507052 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.507028 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:23:38.507614 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.507592 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:23:38.507747 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.507640 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrkhq\"" Apr 22 19:23:38.508288 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.508266 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:23:38.592118 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.592083 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vtx7\" (UniqueName: \"kubernetes.io/projected/b3237a2b-dfbc-4c60-8166-c94d61b4467f-kube-api-access-8vtx7\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.592303 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.592130 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2grk8\" (UniqueName: \"kubernetes.io/projected/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-kube-api-access-2grk8\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:23:38.592303 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.592222 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.592303 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.592279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3237a2b-dfbc-4c60-8166-c94d61b4467f-config-volume\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.592460 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.592344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:23:38.592460 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.592383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3237a2b-dfbc-4c60-8166-c94d61b4467f-tmp-dir\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.693489 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.693400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3237a2b-dfbc-4c60-8166-c94d61b4467f-config-volume\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.693489 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.693446 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:23:38.693732 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:38.693558 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:38.693732 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.693630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3237a2b-dfbc-4c60-8166-c94d61b4467f-tmp-dir\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.693732 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.693711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vtx7\" (UniqueName: \"kubernetes.io/projected/b3237a2b-dfbc-4c60-8166-c94d61b4467f-kube-api-access-8vtx7\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.693909 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:38.693752 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert podName:75bc368d-1a1a-4f77-9a39-1a1b256f1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:39.193723823 +0000 UTC m=+33.603138323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert") pod "ingress-canary-bv9cr" (UID: "75bc368d-1a1a-4f77-9a39-1a1b256f1eb6") : secret "canary-serving-cert" not found Apr 22 19:23:38.693909 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.693808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2grk8\" (UniqueName: \"kubernetes.io/projected/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-kube-api-access-2grk8\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:23:38.693909 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.693871 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.694065 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:38.693971 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:38.694065 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:38.694042 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls podName:b3237a2b-dfbc-4c60-8166-c94d61b4467f nodeName:}" failed. No retries permitted until 2026-04-22 19:23:39.194028094 +0000 UTC m=+33.603442587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls") pod "dns-default-cr9b2" (UID: "b3237a2b-dfbc-4c60-8166-c94d61b4467f") : secret "dns-default-metrics-tls" not found Apr 22 19:23:38.694157 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.694110 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3237a2b-dfbc-4c60-8166-c94d61b4467f-config-volume\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.704511 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.704451 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3237a2b-dfbc-4c60-8166-c94d61b4467f-tmp-dir\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.704705 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.704679 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vtx7\" (UniqueName: \"kubernetes.io/projected/b3237a2b-dfbc-4c60-8166-c94d61b4467f-kube-api-access-8vtx7\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:38.704819 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.704738 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2grk8\" (UniqueName: \"kubernetes.io/projected/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-kube-api-access-2grk8\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:23:38.895842 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.895800 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:38.896029 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:38.895963 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:38.896092 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:38.896041 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs podName:def7cd86-6b79-4c5f-900f-f09644520b6b nodeName:}" failed. No retries permitted until 2026-04-22 19:24:10.896025984 +0000 UTC m=+65.305440477 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs") pod "network-metrics-daemon-w244z" (UID: "def7cd86-6b79-4c5f-900f-f09644520b6b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:38.996244 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:38.996153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlhr5\" (UniqueName: \"kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5\") pod \"network-check-target-qptxp\" (UID: \"90bba156-f068-4b96-a366-bae94c48b2b6\") " pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:38.996412 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:38.996331 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:38.996412 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:38.996356 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:38.996412 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:38.996370 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tlhr5 for pod openshift-network-diagnostics/network-check-target-qptxp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:38.996553 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:38.996436 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5 podName:90bba156-f068-4b96-a366-bae94c48b2b6 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:10.996415787 +0000 UTC m=+65.405830279 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlhr5" (UniqueName: "kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5") pod "network-check-target-qptxp" (UID: "90bba156-f068-4b96-a366-bae94c48b2b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:39.166242 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.166199 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:39.166401 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.166284 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:23:39.166483 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.166399 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp" Apr 22 19:23:39.169623 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.169592 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:23:39.169623 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.169608 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:23:39.169623 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.169616 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ff584\"" Apr 22 19:23:39.169623 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.169627 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8qpcz\"" Apr 22 19:23:39.169920 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.169599 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:23:39.169920 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.169724 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:23:39.198195 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.198158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:39.198296 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.198252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:23:39.198346 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:39.198306 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:39.198405 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:39.198373 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls podName:b3237a2b-dfbc-4c60-8166-c94d61b4467f nodeName:}" failed. No retries permitted until 2026-04-22 19:23:40.198353802 +0000 UTC m=+34.607768298 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls") pod "dns-default-cr9b2" (UID: "b3237a2b-dfbc-4c60-8166-c94d61b4467f") : secret "dns-default-metrics-tls" not found Apr 22 19:23:39.198405 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:39.198376 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:39.198483 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:39.198409 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert podName:75bc368d-1a1a-4f77-9a39-1a1b256f1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:40.198399914 +0000 UTC m=+34.607814409 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert") pod "ingress-canary-bv9cr" (UID: "75bc368d-1a1a-4f77-9a39-1a1b256f1eb6") : secret "canary-serving-cert" not found Apr 22 19:23:39.903501 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.903461 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:39.906006 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:39.905978 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/91f11754-ebaa-489c-8dad-189955ae35aa-original-pull-secret\") pod \"global-pull-secret-syncer-84wfc\" (UID: \"91f11754-ebaa-489c-8dad-189955ae35aa\") " pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:40.078987 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:40.078947 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-84wfc" Apr 22 19:23:40.206452 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:40.206416 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:40.206592 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:40.206479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:23:40.206592 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:40.206560 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:40.206729 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:40.206600 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:40.206729 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:40.206623 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls podName:b3237a2b-dfbc-4c60-8166-c94d61b4467f nodeName:}" failed. No retries permitted until 2026-04-22 19:23:42.206604565 +0000 UTC m=+36.616019077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls") pod "dns-default-cr9b2" (UID: "b3237a2b-dfbc-4c60-8166-c94d61b4467f") : secret "dns-default-metrics-tls" not found Apr 22 19:23:40.206729 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:40.206647 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert podName:75bc368d-1a1a-4f77-9a39-1a1b256f1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:42.206631604 +0000 UTC m=+36.616046100 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert") pod "ingress-canary-bv9cr" (UID: "75bc368d-1a1a-4f77-9a39-1a1b256f1eb6") : secret "canary-serving-cert" not found Apr 22 19:23:40.354406 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:40.354174 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-84wfc"] Apr 22 19:23:40.358768 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:23:40.358744 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91f11754_ebaa_489c_8dad_189955ae35aa.slice/crio-f12c318876493ed5d6bbe108b2c24a5fe4660a218ac6f535778252a0b4fd9549 WatchSource:0}: Error finding container f12c318876493ed5d6bbe108b2c24a5fe4660a218ac6f535778252a0b4fd9549: Status 404 returned error can't find the container with id f12c318876493ed5d6bbe108b2c24a5fe4660a218ac6f535778252a0b4fd9549 Apr 22 19:23:40.423304 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:40.423275 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-84wfc" event={"ID":"91f11754-ebaa-489c-8dad-189955ae35aa","Type":"ContainerStarted","Data":"f12c318876493ed5d6bbe108b2c24a5fe4660a218ac6f535778252a0b4fd9549"} Apr 22 19:23:41.427025 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:41.426996 2572 generic.go:358] "Generic (PLEG): container finished" podID="fc27d5a5-d1ec-4f19-823f-585b5366a986" containerID="b55e924de85e06657e889f4f12aac0e91f98bf56352de5e5579e87cc46f491fa" exitCode=0 Apr 22 19:23:41.427437 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:41.427047 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxnvz" event={"ID":"fc27d5a5-d1ec-4f19-823f-585b5366a986","Type":"ContainerDied","Data":"b55e924de85e06657e889f4f12aac0e91f98bf56352de5e5579e87cc46f491fa"} Apr 22 19:23:42.223301 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:42.223271 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:42.223477 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:42.223333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:23:42.223477 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:42.223411 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:42.223477 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:42.223466 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls podName:b3237a2b-dfbc-4c60-8166-c94d61b4467f nodeName:}" failed. No retries permitted until 2026-04-22 19:23:46.223452348 +0000 UTC m=+40.632866843 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls") pod "dns-default-cr9b2" (UID: "b3237a2b-dfbc-4c60-8166-c94d61b4467f") : secret "dns-default-metrics-tls" not found Apr 22 19:23:42.223606 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:42.223414 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:42.223606 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:42.223536 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert podName:75bc368d-1a1a-4f77-9a39-1a1b256f1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:46.223524496 +0000 UTC m=+40.632938986 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert") pod "ingress-canary-bv9cr" (UID: "75bc368d-1a1a-4f77-9a39-1a1b256f1eb6") : secret "canary-serving-cert" not found Apr 22 19:23:42.432065 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:42.432034 2572 generic.go:358] "Generic (PLEG): container finished" podID="fc27d5a5-d1ec-4f19-823f-585b5366a986" containerID="ea2248f311eb2e8a63d5078b0be7b25dc171693915ce1be81e088dd01912a9b8" exitCode=0 Apr 22 19:23:42.432435 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:42.432092 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxnvz" event={"ID":"fc27d5a5-d1ec-4f19-823f-585b5366a986","Type":"ContainerDied","Data":"ea2248f311eb2e8a63d5078b0be7b25dc171693915ce1be81e088dd01912a9b8"} Apr 22 19:23:43.437993 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:43.437952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxnvz" event={"ID":"fc27d5a5-d1ec-4f19-823f-585b5366a986","Type":"ContainerStarted","Data":"a91c57252f866969f61b59b77a0a525b326a85e3f8aeded4009fdda8c99382d8"} Apr 22 19:23:43.465415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:43.465373 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gxnvz" podStartSLOduration=4.430326958 podStartE2EDuration="37.465360514s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:23:07.469120547 +0000 UTC m=+1.878535038" lastFinishedPulling="2026-04-22 19:23:40.504154101 +0000 UTC m=+34.913568594" observedRunningTime="2026-04-22 19:23:43.463113249 +0000 UTC m=+37.872527761" watchObservedRunningTime="2026-04-22 19:23:43.465360514 +0000 UTC m=+37.874775026" Apr 22 19:23:46.250541 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:46.250343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:23:46.250847 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:46.250587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:23:46.250847 ip-10-0-134-231 kubenswrapper[2572]: E0422 
Apr 22 19:23:46.250847 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:46.250683 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls podName:b3237a2b-dfbc-4c60-8166-c94d61b4467f nodeName:}" failed. No retries permitted until 2026-04-22 19:23:54.250646411 +0000 UTC m=+48.660060956 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls") pod "dns-default-cr9b2" (UID: "b3237a2b-dfbc-4c60-8166-c94d61b4467f") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:46.250847 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:46.250732 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:46.250847 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:46.250775 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert podName:75bc368d-1a1a-4f77-9a39-1a1b256f1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:54.250762025 +0000 UTC m=+48.660176523 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert") pod "ingress-canary-bv9cr" (UID: "75bc368d-1a1a-4f77-9a39-1a1b256f1eb6") : secret "canary-serving-cert" not found
Apr 22 19:23:46.444950 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:46.444913 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-84wfc" event={"ID":"91f11754-ebaa-489c-8dad-189955ae35aa","Type":"ContainerStarted","Data":"b3079f8890cca116ad7a3130f6662083613466082a17fb7f02144aecf2268f2b"}
Apr 22 19:23:46.460903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:46.460853 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-84wfc" podStartSLOduration=8.709936477 podStartE2EDuration="14.460838385s" podCreationTimestamp="2026-04-22 19:23:32 +0000 UTC" firstStartedPulling="2026-04-22 19:23:40.360442471 +0000 UTC m=+34.769856966" lastFinishedPulling="2026-04-22 19:23:46.111344383 +0000 UTC m=+40.520758874" observedRunningTime="2026-04-22 19:23:46.460441999 +0000 UTC m=+40.869856513" watchObservedRunningTime="2026-04-22 19:23:46.460838385 +0000 UTC m=+40.870252892"
Apr 22 19:23:54.302780 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:54.302740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr"
Apr 22 19:23:54.303270 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:23:54.302806 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2"
Apr 22 19:23:54.303270 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:54.302924 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:23:54.303270 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:54.302938 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:54.303270 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:54.302979 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls podName:b3237a2b-dfbc-4c60-8166-c94d61b4467f nodeName:}" failed. No retries permitted until 2026-04-22 19:24:10.302964341 +0000 UTC m=+64.712378832 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls") pod "dns-default-cr9b2" (UID: "b3237a2b-dfbc-4c60-8166-c94d61b4467f") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:54.303270 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:23:54.303016 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert podName:75bc368d-1a1a-4f77-9a39-1a1b256f1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:10.302995861 +0000 UTC m=+64.712410395 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert") pod "ingress-canary-bv9cr" (UID: "75bc368d-1a1a-4f77-9a39-1a1b256f1eb6") : secret "canary-serving-cert" not found
Apr 22 19:24:05.668012 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:05.667976 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2jbww"
Apr 22 19:24:10.303144 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:10.303111 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2"
Apr 22 19:24:10.303144 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:10.303158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr"
Apr 22 19:24:10.303560 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:24:10.303258 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:10.303560 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:24:10.303272 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:10.303560 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:24:10.303316 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls podName:b3237a2b-dfbc-4c60-8166-c94d61b4467f nodeName:}" failed. No retries permitted until 2026-04-22 19:24:42.303300608 +0000 UTC m=+96.712715103 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls") pod "dns-default-cr9b2" (UID: "b3237a2b-dfbc-4c60-8166-c94d61b4467f") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:10.303560 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:24:10.303330 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert podName:75bc368d-1a1a-4f77-9a39-1a1b256f1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:42.30332412 +0000 UTC m=+96.712738611 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert") pod "ingress-canary-bv9cr" (UID: "75bc368d-1a1a-4f77-9a39-1a1b256f1eb6") : secret "canary-serving-cert" not found
Apr 22 19:24:10.906712 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:10.906679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z"
Apr 22 19:24:10.911984 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:10.911962 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:24:10.917491 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:24:10.917471 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:24:10.917582 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:24:10.917539 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs podName:def7cd86-6b79-4c5f-900f-f09644520b6b nodeName:}" failed. No retries permitted until 2026-04-22 19:25:14.91751483 +0000 UTC m=+129.326929338 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs") pod "network-metrics-daemon-w244z" (UID: "def7cd86-6b79-4c5f-900f-f09644520b6b") : secret "metrics-daemon-secret" not found
Apr 22 19:24:11.007796 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:11.007769 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlhr5\" (UniqueName: \"kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5\") pod \"network-check-target-qptxp\" (UID: \"90bba156-f068-4b96-a366-bae94c48b2b6\") " pod="openshift-network-diagnostics/network-check-target-qptxp"
Apr 22 19:24:11.010857 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:11.010838 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:24:11.022678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:11.022634 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:24:11.034109 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:11.034088 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlhr5\" (UniqueName: \"kubernetes.io/projected/90bba156-f068-4b96-a366-bae94c48b2b6-kube-api-access-tlhr5\") pod \"network-check-target-qptxp\" (UID: \"90bba156-f068-4b96-a366-bae94c48b2b6\") " pod="openshift-network-diagnostics/network-check-target-qptxp"
Apr 22 19:24:11.289161 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:11.289088 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8qpcz\""
Apr 22 19:24:11.297366 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:11.297349 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qptxp"
Apr 22 19:24:11.411243 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:11.411214 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qptxp"]
Apr 22 19:24:11.415334 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:24:11.415303 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90bba156_f068_4b96_a366_bae94c48b2b6.slice/crio-d61dfceeb469932b13d452d40ab9b0c46af3ad4a8a78d2b5651eeab3eb52b3c1 WatchSource:0}: Error finding container d61dfceeb469932b13d452d40ab9b0c46af3ad4a8a78d2b5651eeab3eb52b3c1: Status 404 returned error can't find the container with id d61dfceeb469932b13d452d40ab9b0c46af3ad4a8a78d2b5651eeab3eb52b3c1
Apr 22 19:24:11.491577 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:11.491544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qptxp" event={"ID":"90bba156-f068-4b96-a366-bae94c48b2b6","Type":"ContainerStarted","Data":"d61dfceeb469932b13d452d40ab9b0c46af3ad4a8a78d2b5651eeab3eb52b3c1"}
Apr 22 19:24:14.498573 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:14.498533 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qptxp" event={"ID":"90bba156-f068-4b96-a366-bae94c48b2b6","Type":"ContainerStarted","Data":"2eca5cfe06931449741eff7a0d8a48c5b10f4874b89baf36673e13bed2d6ff4e"}
Apr 22 19:24:14.499003 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:14.498643 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qptxp"
Apr 22 19:24:14.514295 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:14.514251 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qptxp" podStartSLOduration=65.856447073 podStartE2EDuration="1m8.514239293s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:24:11.41752816 +0000 UTC m=+65.826942665" lastFinishedPulling="2026-04-22 19:24:14.075320395 +0000 UTC m=+68.484734885" observedRunningTime="2026-04-22 19:24:14.51355801 +0000 UTC m=+68.922972522" watchObservedRunningTime="2026-04-22 19:24:14.514239293 +0000 UTC m=+68.923653805"
Apr 22 19:24:42.308443 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:42.308293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr"
Apr 22 19:24:42.308443 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:42.308351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2"
Apr 22 19:24:42.309078 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:24:42.308447 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:42.309078 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:24:42.308456 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:42.309078 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:24:42.308514 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls podName:b3237a2b-dfbc-4c60-8166-c94d61b4467f nodeName:}" failed. No retries permitted until 2026-04-22 19:25:46.308499456 +0000 UTC m=+160.717913946 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls") pod "dns-default-cr9b2" (UID: "b3237a2b-dfbc-4c60-8166-c94d61b4467f") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:42.309078 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:24:42.308530 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert podName:75bc368d-1a1a-4f77-9a39-1a1b256f1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:46.308523275 +0000 UTC m=+160.717937766 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert") pod "ingress-canary-bv9cr" (UID: "75bc368d-1a1a-4f77-9a39-1a1b256f1eb6") : secret "canary-serving-cert" not found
Apr 22 19:24:45.502500 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:24:45.502468 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qptxp"
Apr 22 19:25:14.920446 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:14.920408 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z"
Apr 22 19:25:14.920957 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:14.920518 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:25:14.920957 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:14.920581 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs podName:def7cd86-6b79-4c5f-900f-f09644520b6b nodeName:}" failed. No retries permitted until 2026-04-22 19:27:16.920567165 +0000 UTC m=+251.329981655 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs") pod "network-metrics-daemon-w244z" (UID: "def7cd86-6b79-4c5f-900f-f09644520b6b") : secret "metrics-daemon-secret" not found
Apr 22 19:25:36.551179 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.551145 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8"]
Apr 22 19:25:36.554010 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.553991 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8"
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:36.555716 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.555689 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-74d655c858-jl4ng"] Apr 22 19:25:36.557345 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.557326 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:25:36.557628 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.557614 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:25:36.558292 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.558276 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.561492 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.561466 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-jvwkk\"" Apr 22 19:25:36.561746 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.561722 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 19:25:36.561893 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.561866 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 19:25:36.561969 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.561939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 19:25:36.562351 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.562330 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 19:25:36.562586 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.562568 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 19:25:36.564868 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.564844 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 19:25:36.565738 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.565721 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 19:25:36.566113 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.566075 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 19:25:36.566498 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.566481 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-bzm9l\"" Apr 22 19:25:36.569065 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.569043 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8"] Apr 22 19:25:36.577119 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.577098 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-74d655c858-jl4ng"] Apr 22 19:25:36.659628 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.659603 
2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-294vl"] Apr 22 19:25:36.662539 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.662523 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks"] Apr 22 19:25:36.662683 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.662652 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.662744 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.662719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-stats-auth\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.662801 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.662758 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-default-certificate\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.662801 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.662782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.662910 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.662803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72t49\" (UniqueName: \"kubernetes.io/projected/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-kube-api-access-72t49\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.662910 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.662839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:36.662910 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.662864 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxqdc\" (UniqueName: \"kubernetes.io/projected/7782c208-eb03-4951-80cf-8089fcbf8cb4-kube-api-access-bxqdc\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:36.663090 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.662946 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.663090 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.663014 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7782c208-eb03-4951-80cf-8089fcbf8cb4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:36.665307 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.665287 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:36.678378 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.678360 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 19:25:36.680030 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.680008 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 19:25:36.680593 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.680572 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-8hghw\"" Apr 22 19:25:36.680593 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.680584 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:25:36.680765 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.680609 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 19:25:36.680765 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.680645 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:25:36.680946 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.680915 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 19:25:36.681047 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.681030 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-294vl"] Apr 22 19:25:36.681320 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.681303 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xgfsz\"" Apr 22 19:25:36.681596 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.681581 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 19:25:36.681878 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.681861 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks"] Apr 22 19:25:36.690160 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.690143 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 19:25:36.763409 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-stats-auth\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.763496 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763414 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-trusted-ca\") pod \"console-operator-9d4b6777b-294vl\" (UID: \"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.763496 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763439 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-config\") pod \"console-operator-9d4b6777b-294vl\" (UID: \"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.763496 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-default-certificate\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.763635 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763498 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4525z\" (UniqueName: \"kubernetes.io/projected/f550de78-8350-40bf-9d1b-4014d7f2fd62-kube-api-access-4525z\") pod \"cluster-samples-operator-6dc5bdb6b4-8l2ks\" (UID: \"f550de78-8350-40bf-9d1b-4014d7f2fd62\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:36.763635 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763553 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl5c7\" (UniqueName: \"kubernetes.io/projected/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-kube-api-access-kl5c7\") pod \"console-operator-9d4b6777b-294vl\" (UID: \"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.763770 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.763770 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72t49\" (UniqueName: \"kubernetes.io/projected/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-kube-api-access-72t49\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " 
pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.763770 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763751 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:36.763916 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:36.763820 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:36.763916 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxqdc\" (UniqueName: \"kubernetes.io/projected/7782c208-eb03-4951-80cf-8089fcbf8cb4-kube-api-access-bxqdc\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:36.763916 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:36.763820 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:25:36.763916 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:36.763870 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls podName:7782c208-eb03-4951-80cf-8089fcbf8cb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:37.263856842 +0000 UTC m=+151.673271337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-95jw8" (UID: "7782c208-eb03-4951-80cf-8089fcbf8cb4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:36.763916 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763884 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.764143 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:36.763937 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs podName:c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:37.263903767 +0000 UTC m=+151.673318265 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs") pod "router-default-74d655c858-jl4ng" (UID: "c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5") : secret "router-metrics-certs-default" not found Apr 22 19:25:36.764143 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:36.763980 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle podName:c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:37.263969937 +0000 UTC m=+151.673384438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle") pod "router-default-74d655c858-jl4ng" (UID: "c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5") : configmap references non-existent config key: service-ca.crt Apr 22 19:25:36.764143 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.763978 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8l2ks\" (UID: \"f550de78-8350-40bf-9d1b-4014d7f2fd62\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:36.764143 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.764033 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-serving-cert\") pod \"console-operator-9d4b6777b-294vl\" (UID: \"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.764143 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.764066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7782c208-eb03-4951-80cf-8089fcbf8cb4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:36.765330 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.765315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7782c208-eb03-4951-80cf-8089fcbf8cb4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:36.765831 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.765811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-stats-auth\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.765967 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.765949 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-default-certificate\") pod \"router-default-74d655c858-jl4ng\" (UID: 
\"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.778373 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.778349 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxqdc\" (UniqueName: \"kubernetes.io/projected/7782c208-eb03-4951-80cf-8089fcbf8cb4-kube-api-access-bxqdc\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:36.779494 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.779475 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72t49\" (UniqueName: \"kubernetes.io/projected/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-kube-api-access-72t49\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:36.864347 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.864287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-config\") pod \"console-operator-9d4b6777b-294vl\" (UID: \"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.864347 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.864332 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4525z\" (UniqueName: \"kubernetes.io/projected/f550de78-8350-40bf-9d1b-4014d7f2fd62-kube-api-access-4525z\") pod \"cluster-samples-operator-6dc5bdb6b4-8l2ks\" (UID: \"f550de78-8350-40bf-9d1b-4014d7f2fd62\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:36.864347 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.864351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kl5c7\" (UniqueName: \"kubernetes.io/projected/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-kube-api-access-kl5c7\") pod \"console-operator-9d4b6777b-294vl\" (UID: \"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.864547 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.864454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8l2ks\" (UID: \"f550de78-8350-40bf-9d1b-4014d7f2fd62\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:36.864547 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.864486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-serving-cert\") pod \"console-operator-9d4b6777b-294vl\" (UID: \"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.864547 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.864523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-trusted-ca\") pod \"console-operator-9d4b6777b-294vl\" (UID: 
\"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.864547 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:36.864528 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:25:36.864712 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:36.864579 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls podName:f550de78-8350-40bf-9d1b-4014d7f2fd62 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:37.364564935 +0000 UTC m=+151.773979472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8l2ks" (UID: "f550de78-8350-40bf-9d1b-4014d7f2fd62") : secret "samples-operator-tls" not found Apr 22 19:25:36.865381 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.865359 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-config\") pod \"console-operator-9d4b6777b-294vl\" (UID: \"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.865553 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.865536 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-trusted-ca\") pod \"console-operator-9d4b6777b-294vl\" (UID: \"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.866650 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.866633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-serving-cert\") pod \"console-operator-9d4b6777b-294vl\" (UID: \"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.874377 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.874354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4525z\" (UniqueName: \"kubernetes.io/projected/f550de78-8350-40bf-9d1b-4014d7f2fd62-kube-api-access-4525z\") pod \"cluster-samples-operator-6dc5bdb6b4-8l2ks\" (UID: \"f550de78-8350-40bf-9d1b-4014d7f2fd62\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:36.874453 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.874437 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl5c7\" (UniqueName: \"kubernetes.io/projected/94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1-kube-api-access-kl5c7\") pod \"console-operator-9d4b6777b-294vl\" (UID: \"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1\") " pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:36.971466 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:36.971444 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:37.082573 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:37.082546 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-294vl"] Apr 22 19:25:37.086887 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:25:37.086860 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a5da67_cf92_4ff3_ace9_bb2d6bf8bcf1.slice/crio-503bb3b08ab3de5733dcac84ed4f196e38e1a490ca2fb630e69a97171676a77d WatchSource:0}: Error finding container 503bb3b08ab3de5733dcac84ed4f196e38e1a490ca2fb630e69a97171676a77d: Status 404 returned error can't find the container with id 503bb3b08ab3de5733dcac84ed4f196e38e1a490ca2fb630e69a97171676a77d Apr 22 19:25:37.267535 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:37.267505 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:37.267651 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:37.267553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:37.267651 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:37.267586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:37.267798 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:37.267650 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:25:37.267798 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:37.267722 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:37.267798 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:37.267727 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs podName:c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:38.267707744 +0000 UTC m=+152.677122236 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs") pod "router-default-74d655c858-jl4ng" (UID: "c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5") : secret "router-metrics-certs-default" not found Apr 22 19:25:37.267798 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:37.267768 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle podName:c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:38.267755584 +0000 UTC m=+152.677170076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle") pod "router-default-74d655c858-jl4ng" (UID: "c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5") : configmap references non-existent config key: service-ca.crt Apr 22 19:25:37.267798 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:37.267779 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls podName:7782c208-eb03-4951-80cf-8089fcbf8cb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:38.267773072 +0000 UTC m=+152.677187563 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-95jw8" (UID: "7782c208-eb03-4951-80cf-8089fcbf8cb4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:37.367893 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:37.367870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8l2ks\" (UID: \"f550de78-8350-40bf-9d1b-4014d7f2fd62\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:37.367986 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:37.367955 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:25:37.368032 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:37.368001 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls podName:f550de78-8350-40bf-9d1b-4014d7f2fd62 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:38.36799214 +0000 UTC m=+152.777406631 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8l2ks" (UID: "f550de78-8350-40bf-9d1b-4014d7f2fd62") : secret "samples-operator-tls" not found Apr 22 19:25:37.658506 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:37.658450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" event={"ID":"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1","Type":"ContainerStarted","Data":"503bb3b08ab3de5733dcac84ed4f196e38e1a490ca2fb630e69a97171676a77d"} Apr 22 19:25:38.276950 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:38.276913 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:38.277124 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:38.276958 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:38.277124 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:38.276982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:38.277124 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:38.277065 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:25:38.277282 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:38.277065 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:38.277282 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:38.277142 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs podName:c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:40.277126133 +0000 UTC m=+154.686540629 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs") pod "router-default-74d655c858-jl4ng" (UID: "c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5") : secret "router-metrics-certs-default" not found Apr 22 19:25:38.277282 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:38.277156 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle podName:c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:40.277149989 +0000 UTC m=+154.686564480 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle") pod "router-default-74d655c858-jl4ng" (UID: "c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5") : configmap references non-existent config key: service-ca.crt Apr 22 19:25:38.277282 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:38.277165 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls podName:7782c208-eb03-4951-80cf-8089fcbf8cb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:40.277160287 +0000 UTC m=+154.686574778 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-95jw8" (UID: "7782c208-eb03-4951-80cf-8089fcbf8cb4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:38.377947 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:38.377920 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8l2ks\" (UID: \"f550de78-8350-40bf-9d1b-4014d7f2fd62\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:38.378097 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:38.378061 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:25:38.378147 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:38.378128 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls podName:f550de78-8350-40bf-9d1b-4014d7f2fd62 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:40.378109291 +0000 UTC m=+154.787523798 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8l2ks" (UID: "f550de78-8350-40bf-9d1b-4014d7f2fd62") : secret "samples-operator-tls" not found Apr 22 19:25:39.664151 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:39.664121 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/0.log" Apr 22 19:25:39.664549 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:39.664167 2572 generic.go:358] "Generic (PLEG): container finished" podID="94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1" containerID="e3d2a2439e271aeef440856694ff6743b9479613d762282b1c50a02ff6cd8289" exitCode=255 Apr 22 19:25:39.664549 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:39.664239 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" event={"ID":"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1","Type":"ContainerDied","Data":"e3d2a2439e271aeef440856694ff6743b9479613d762282b1c50a02ff6cd8289"} Apr 22 19:25:39.664549 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:39.664480 2572 scope.go:117] "RemoveContainer" containerID="e3d2a2439e271aeef440856694ff6743b9479613d762282b1c50a02ff6cd8289" Apr 22 19:25:40.295002 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.294971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:40.295149 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.295012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:40.295149 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.295035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:40.295149 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:40.295130 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:25:40.295149 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:40.295146 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle podName:c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:44.29513362 +0000 UTC m=+158.704548115 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle") pod "router-default-74d655c858-jl4ng" (UID: "c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5") : configmap references non-existent config key: service-ca.crt Apr 22 19:25:40.295300 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:40.295180 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs podName:c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:44.295167621 +0000 UTC m=+158.704582112 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs") pod "router-default-74d655c858-jl4ng" (UID: "c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5") : secret "router-metrics-certs-default" not found Apr 22 19:25:40.295300 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:40.295227 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:40.295300 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:40.295273 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls podName:7782c208-eb03-4951-80cf-8089fcbf8cb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:44.295262358 +0000 UTC m=+158.704676849 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-95jw8" (UID: "7782c208-eb03-4951-80cf-8089fcbf8cb4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:40.396316 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.396289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8l2ks\" (UID: \"f550de78-8350-40bf-9d1b-4014d7f2fd62\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:40.396445 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:40.396400 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:25:40.396512 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:40.396451 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls podName:f550de78-8350-40bf-9d1b-4014d7f2fd62 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:44.396436867 +0000 UTC m=+158.805851358 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8l2ks" (UID: "f550de78-8350-40bf-9d1b-4014d7f2fd62") : secret "samples-operator-tls" not found Apr 22 19:25:40.667488 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.667463 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:25:40.667940 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.667918 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/0.log" Apr 22 19:25:40.668092 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.667969 2572 generic.go:358] "Generic (PLEG): container finished" podID="94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1" containerID="11667c2425afea64097218d361a0c07dcb7093422bb4e69dab8215ef99162053" exitCode=255 Apr 22 19:25:40.668092 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.668022 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" event={"ID":"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1","Type":"ContainerDied","Data":"11667c2425afea64097218d361a0c07dcb7093422bb4e69dab8215ef99162053"} Apr 22 19:25:40.668092 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.668085 2572 scope.go:117] "RemoveContainer" containerID="e3d2a2439e271aeef440856694ff6743b9479613d762282b1c50a02ff6cd8289" Apr 22 19:25:40.668309 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.668273 2572 scope.go:117] "RemoveContainer" containerID="11667c2425afea64097218d361a0c07dcb7093422bb4e69dab8215ef99162053" Apr 22 19:25:40.668510 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:40.668485 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-294vl_openshift-console-operator(94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1)\"" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" podUID="94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1" Apr 22 19:25:40.881061 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.881034 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf"] Apr 22 19:25:40.884050 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.884034 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf" Apr 22 19:25:40.887701 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.887677 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kpt62\"" Apr 22 19:25:40.887800 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.887700 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 19:25:40.887800 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.887706 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 19:25:40.894980 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:40.894960 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf"] Apr 22 19:25:41.001646 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:41.001586 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qshks\" (UniqueName: \"kubernetes.io/projected/d96bb751-7faf-40c6-b20a-4ce5e5009c3e-kube-api-access-qshks\") pod \"migrator-74bb7799d9-s9hpf\" (UID: \"d96bb751-7faf-40c6-b20a-4ce5e5009c3e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf" Apr 22 19:25:41.102235 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:41.102205 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qshks\" (UniqueName: \"kubernetes.io/projected/d96bb751-7faf-40c6-b20a-4ce5e5009c3e-kube-api-access-qshks\") pod \"migrator-74bb7799d9-s9hpf\" (UID: \"d96bb751-7faf-40c6-b20a-4ce5e5009c3e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf" Apr 22 19:25:41.110920 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:41.110898 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qshks\" (UniqueName: \"kubernetes.io/projected/d96bb751-7faf-40c6-b20a-4ce5e5009c3e-kube-api-access-qshks\") pod \"migrator-74bb7799d9-s9hpf\" (UID: \"d96bb751-7faf-40c6-b20a-4ce5e5009c3e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf" Apr 22 19:25:41.192388 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:41.192363 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf" Apr 22 19:25:41.304199 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:41.304164 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf"] Apr 22 19:25:41.307420 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:25:41.307392 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd96bb751_7faf_40c6_b20a_4ce5e5009c3e.slice/crio-a3132ceab21199b609dd6f13cba8712a2a6cb62c6f221cadaae4292bbf4e0ae1 WatchSource:0}: Error finding container a3132ceab21199b609dd6f13cba8712a2a6cb62c6f221cadaae4292bbf4e0ae1: Status 404 returned error can't find the container with id a3132ceab21199b609dd6f13cba8712a2a6cb62c6f221cadaae4292bbf4e0ae1 Apr 22 19:25:41.501337 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:41.501305 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-cr9b2" podUID="b3237a2b-dfbc-4c60-8166-c94d61b4467f" Apr 22 19:25:41.514656 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:41.514610 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bv9cr" podUID="75bc368d-1a1a-4f77-9a39-1a1b256f1eb6" Apr 22 19:25:41.671006 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:41.670981 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:25:41.671342 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:41.671330 2572 scope.go:117] "RemoveContainer" containerID="11667c2425afea64097218d361a0c07dcb7093422bb4e69dab8215ef99162053" Apr 22 19:25:41.671560 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:41.671537 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-294vl_openshift-console-operator(94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1)\"" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" podUID="94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1" Apr 22 19:25:41.672173 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:41.672150 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf" event={"ID":"d96bb751-7faf-40c6-b20a-4ce5e5009c3e","Type":"ContainerStarted","Data":"a3132ceab21199b609dd6f13cba8712a2a6cb62c6f221cadaae4292bbf4e0ae1"} Apr 22 19:25:41.672231 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:41.672184 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cr9b2" Apr 22 19:25:42.192490 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:42.192454 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-w244z" podUID="def7cd86-6b79-4c5f-900f-f09644520b6b" Apr 22 19:25:42.675202 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:42.675169 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf" event={"ID":"d96bb751-7faf-40c6-b20a-4ce5e5009c3e","Type":"ContainerStarted","Data":"0d86938b31f8d29e112ed6be6fb934342ea18618c345d0b41b55f90bed0b2b8b"} Apr 22 19:25:42.675511 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:42.675207 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf" event={"ID":"d96bb751-7faf-40c6-b20a-4ce5e5009c3e","Type":"ContainerStarted","Data":"5cf87957021900b945751c03036441f30a608322d75c35e8a4d42ff9c447bb93"} Apr 22 19:25:42.692156 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:42.692109 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-s9hpf" podStartSLOduration=1.669765298 podStartE2EDuration="2.692095956s" podCreationTimestamp="2026-04-22 19:25:40 +0000 UTC" firstStartedPulling="2026-04-22 19:25:41.309252258 +0000 UTC m=+155.718666752" lastFinishedPulling="2026-04-22 19:25:42.331582916 +0000 UTC m=+156.740997410" observedRunningTime="2026-04-22 19:25:42.690192155 +0000 UTC m=+157.099606668" watchObservedRunningTime="2026-04-22 19:25:42.692095956 +0000 UTC m=+157.101510469" Apr 22 19:25:44.146499 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:44.146466 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zv45q_2c867e00-703c-41c3-8964-4eee7b3451c9/dns-node-resolver/0.log" Apr 22 19:25:44.324613 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:44.324583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:44.324772 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:44.324623 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:44.324772 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:44.324736 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:44.324772 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:44.324736 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: 
secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:44.324915 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:44.324776 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:25:44.324915 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:44.324813 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle podName:c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:52.324794005 +0000 UTC m=+166.734208513 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle") pod "router-default-74d655c858-jl4ng" (UID: "c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5") : configmap references non-existent config key: service-ca.crt Apr 22 19:25:44.324915 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:44.324834 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls podName:7782c208-eb03-4951-80cf-8089fcbf8cb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:52.324827179 +0000 UTC m=+166.734241670 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-95jw8" (UID: "7782c208-eb03-4951-80cf-8089fcbf8cb4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:44.324915 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:44.324846 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs podName:c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:52.324838684 +0000 UTC m=+166.734253175 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs") pod "router-default-74d655c858-jl4ng" (UID: "c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5") : secret "router-metrics-certs-default" not found Apr 22 19:25:44.425138 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:44.425114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8l2ks\" (UID: \"f550de78-8350-40bf-9d1b-4014d7f2fd62\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:44.425246 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:44.425221 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:25:44.425300 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:44.425263 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls podName:f550de78-8350-40bf-9d1b-4014d7f2fd62 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:52.425251922 +0000 UTC m=+166.834666412 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8l2ks" (UID: "f550de78-8350-40bf-9d1b-4014d7f2fd62") : secret "samples-operator-tls" not found Apr 22 19:25:45.144041 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:45.144013 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d9v44_1130a4f2-a77f-4484-b9a9-046f3553e57b/node-ca/0.log" Apr 22 19:25:46.144459 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:46.144431 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-s9hpf_d96bb751-7faf-40c6-b20a-4ce5e5009c3e/migrator/0.log" Apr 22 19:25:46.338832 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:46.338803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:25:46.338979 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:46.338849 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:25:46.338979 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:46.338966 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:25:46.339058 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:46.339021 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls podName:b3237a2b-dfbc-4c60-8166-c94d61b4467f nodeName:}" failed. No retries permitted until 2026-04-22 19:27:48.339005511 +0000 UTC m=+282.748420027 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls") pod "dns-default-cr9b2" (UID: "b3237a2b-dfbc-4c60-8166-c94d61b4467f") : secret "dns-default-metrics-tls" not found Apr 22 19:25:46.339100 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:46.339060 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:25:46.339143 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:46.339104 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert podName:75bc368d-1a1a-4f77-9a39-1a1b256f1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 19:27:48.339092871 +0000 UTC m=+282.748507362 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert") pod "ingress-canary-bv9cr" (UID: "75bc368d-1a1a-4f77-9a39-1a1b256f1eb6") : secret "canary-serving-cert" not found Apr 22 19:25:46.346248 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:46.346225 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-s9hpf_d96bb751-7faf-40c6-b20a-4ce5e5009c3e/graceful-termination/0.log" Apr 22 19:25:46.972127 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:46.972094 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:46.972301 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:46.972190 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:46.972453 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:46.972440 2572 scope.go:117] "RemoveContainer" containerID="11667c2425afea64097218d361a0c07dcb7093422bb4e69dab8215ef99162053" Apr 22 19:25:46.972608 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:46.972591 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-294vl_openshift-console-operator(94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1)\"" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" podUID="94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1" Apr 22 19:25:47.686644 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:47.686614 2572 scope.go:117] "RemoveContainer" containerID="11667c2425afea64097218d361a0c07dcb7093422bb4e69dab8215ef99162053" Apr 22 19:25:47.687080 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:47.686806 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-294vl_openshift-console-operator(94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1)\"" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" podUID="94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1" Apr 22 19:25:52.378926 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.378879 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:52.379293 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.378944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" Apr 22 19:25:52.379293 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.378978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle\") pod 
\"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:52.379293 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:52.379087 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:52.379293 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:25:52.379154 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls podName:7782c208-eb03-4951-80cf-8089fcbf8cb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:08.379134015 +0000 UTC m=+182.788548506 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-95jw8" (UID: "7782c208-eb03-4951-80cf-8089fcbf8cb4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:52.379602 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.379583 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-service-ca-bundle\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:52.381143 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.381119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5-metrics-certs\") pod \"router-default-74d655c858-jl4ng\" (UID: \"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5\") " pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:52.474199 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.474177 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:52.479899 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.479876 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8l2ks\" (UID: \"f550de78-8350-40bf-9d1b-4014d7f2fd62\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:52.482307 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.482284 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f550de78-8350-40bf-9d1b-4014d7f2fd62-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8l2ks\" (UID: \"f550de78-8350-40bf-9d1b-4014d7f2fd62\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:52.575967 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.575942 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" Apr 22 19:25:52.598369 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.598341 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-74d655c858-jl4ng"] Apr 22 19:25:52.601492 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:25:52.601469 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc452dcdb_3433_4c9c_a0e5_3f3bea4be3d5.slice/crio-3455bab3879fd18b3ea2ec088b70161633afd8702b59571b170e122bdef73e0b WatchSource:0}: Error finding container 3455bab3879fd18b3ea2ec088b70161633afd8702b59571b170e122bdef73e0b: Status 404 returned error can't find the container with id 3455bab3879fd18b3ea2ec088b70161633afd8702b59571b170e122bdef73e0b Apr 22 19:25:52.694176 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.694147 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks"] Apr 22 19:25:52.698598 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.698573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-74d655c858-jl4ng" event={"ID":"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5","Type":"ContainerStarted","Data":"ca3ba81b4c27fa983618253a84b533534be8c5d3aec97234361fcd05a7764864"} Apr 22 19:25:52.698714 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:52.698603 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-74d655c858-jl4ng" event={"ID":"c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5","Type":"ContainerStarted","Data":"3455bab3879fd18b3ea2ec088b70161633afd8702b59571b170e122bdef73e0b"} Apr 22 19:25:53.165678 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:53.165635 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:25:53.475393 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:53.475319 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:53.478225 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:53.478203 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:53.497649 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:53.497591 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-74d655c858-jl4ng" podStartSLOduration=17.497575825 podStartE2EDuration="17.497575825s" podCreationTimestamp="2026-04-22 19:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:25:52.743807839 +0000 UTC m=+167.153222351" watchObservedRunningTime="2026-04-22 19:25:53.497575825 +0000 UTC m=+167.906990340" Apr 22 19:25:53.701984 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:53.701945 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" event={"ID":"f550de78-8350-40bf-9d1b-4014d7f2fd62","Type":"ContainerStarted","Data":"1c68bd04a770a5057641daca2318c39b4ee4b652dfa4bf18b282b288c4f3f956"} Apr 22 19:25:53.702201 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:53.702178 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:53.703469 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:53.703446 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-74d655c858-jl4ng" Apr 22 19:25:54.706156 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:54.706127 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" event={"ID":"f550de78-8350-40bf-9d1b-4014d7f2fd62","Type":"ContainerStarted","Data":"dc1f15e07748898798083cd9cb7481e58e0cc9a9ed081dc18074625e934ef588"} Apr 22 19:25:54.706551 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:54.706163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" event={"ID":"f550de78-8350-40bf-9d1b-4014d7f2fd62","Type":"ContainerStarted","Data":"f10841d9d9b1472c1f5bccbf2e59a3d7a84f9d90122451a8f83998eb161f1fd0"} Apr 22 19:25:54.727832 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:54.727795 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8l2ks" podStartSLOduration=17.12827659 podStartE2EDuration="18.727782731s" podCreationTimestamp="2026-04-22 19:25:36 +0000 UTC" firstStartedPulling="2026-04-22 19:25:52.731286498 +0000 UTC m=+167.140700988" lastFinishedPulling="2026-04-22 19:25:54.330792638 +0000 UTC m=+168.740207129" observedRunningTime="2026-04-22 19:25:54.726519569 +0000 UTC m=+169.135934107" watchObservedRunningTime="2026-04-22 19:25:54.727782731 +0000 UTC m=+169.137197244" Apr 22 19:25:57.166491 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:57.166419 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:25:59.166500 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:59.166454 2572 scope.go:117] "RemoveContainer" containerID="11667c2425afea64097218d361a0c07dcb7093422bb4e69dab8215ef99162053" Apr 22 19:25:59.721508 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:59.721484 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:25:59.721795 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:59.721555 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" event={"ID":"94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1","Type":"ContainerStarted","Data":"ede301610c4b5b8e66ac939b93776bc3a8fe5570ef0d5a67543c67874607e92f"} Apr 22 19:25:59.721882 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:59.721836 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:59.730160 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:59.730137 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" Apr 22 19:25:59.748764 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:25:59.748725 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-294vl" podStartSLOduration=22.127800576 podStartE2EDuration="23.748712737s" podCreationTimestamp="2026-04-22 19:25:36 +0000 UTC" firstStartedPulling="2026-04-22 19:25:37.088894249 +0000 UTC m=+151.498308740" lastFinishedPulling="2026-04-22 19:25:38.709806411 +0000 UTC m=+153.119220901" observedRunningTime="2026-04-22 19:25:59.747806466 +0000 UTC m=+174.157220979" watchObservedRunningTime="2026-04-22 19:25:59.748712737 +0000 UTC m=+174.158127250" Apr 22 19:26:04.216149 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.216124 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-j72zs"] Apr 22 19:26:04.219298 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.219282 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.223505 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.223487 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:26:04.223611 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.223523 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:26:04.224576 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.224553 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:26:04.224654 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.224577 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-2cndx\"" Apr 22 19:26:04.224737 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.224564 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:26:04.238716 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.238640 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j72zs"] Apr 22 19:26:04.367275 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.367249 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/64c67afe-5ac1-489a-897a-9fcfbfb51c30-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.367389 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.367307 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/64c67afe-5ac1-489a-897a-9fcfbfb51c30-crio-socket\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.367389 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.367328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scgwv\" (UniqueName: \"kubernetes.io/projected/64c67afe-5ac1-489a-897a-9fcfbfb51c30-kube-api-access-scgwv\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.367476 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.367410 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/64c67afe-5ac1-489a-897a-9fcfbfb51c30-data-volume\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.367476 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.367436 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/64c67afe-5ac1-489a-897a-9fcfbfb51c30-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j72zs\" (UID: 
\"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.468001 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.467941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/64c67afe-5ac1-489a-897a-9fcfbfb51c30-data-volume\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.468001 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.467976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/64c67afe-5ac1-489a-897a-9fcfbfb51c30-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.468172 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.468023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/64c67afe-5ac1-489a-897a-9fcfbfb51c30-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.468172 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.468083 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/64c67afe-5ac1-489a-897a-9fcfbfb51c30-crio-socket\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.468172 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.468109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scgwv\" (UniqueName: \"kubernetes.io/projected/64c67afe-5ac1-489a-897a-9fcfbfb51c30-kube-api-access-scgwv\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.468172 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.468164 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/64c67afe-5ac1-489a-897a-9fcfbfb51c30-crio-socket\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.468334 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.468313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/64c67afe-5ac1-489a-897a-9fcfbfb51c30-data-volume\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.468560 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.468543 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/64c67afe-5ac1-489a-897a-9fcfbfb51c30-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs" Apr 22 19:26:04.470239 ip-10-0-134-231 
Apr 22 19:26:04.476756 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.476732 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scgwv\" (UniqueName: \"kubernetes.io/projected/64c67afe-5ac1-489a-897a-9fcfbfb51c30-kube-api-access-scgwv\") pod \"insights-runtime-extractor-j72zs\" (UID: \"64c67afe-5ac1-489a-897a-9fcfbfb51c30\") " pod="openshift-insights/insights-runtime-extractor-j72zs"
Apr 22 19:26:04.528704 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.528683 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j72zs"
Apr 22 19:26:04.653840 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.653809 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j72zs"]
Apr 22 19:26:04.658714 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:26:04.658690 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64c67afe_5ac1_489a_897a_9fcfbfb51c30.slice/crio-bcd423fd6fc0c192b9dce5d83edbc2e04c18da82ad44ebb6ce49faae3cabfd65 WatchSource:0}: Error finding container bcd423fd6fc0c192b9dce5d83edbc2e04c18da82ad44ebb6ce49faae3cabfd65: Status 404 returned error can't find the container with id bcd423fd6fc0c192b9dce5d83edbc2e04c18da82ad44ebb6ce49faae3cabfd65
Apr 22 19:26:04.734160 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.734108 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j72zs" event={"ID":"64c67afe-5ac1-489a-897a-9fcfbfb51c30","Type":"ContainerStarted","Data":"0aa9ee9431029d2423555d2213e7978d7933976e7a8d3542b0442590492a4421"}
Apr 22 19:26:04.734160 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:04.734141 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j72zs" event={"ID":"64c67afe-5ac1-489a-897a-9fcfbfb51c30","Type":"ContainerStarted","Data":"bcd423fd6fc0c192b9dce5d83edbc2e04c18da82ad44ebb6ce49faae3cabfd65"}
Apr 22 19:26:05.740761 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:05.740722 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j72zs" event={"ID":"64c67afe-5ac1-489a-897a-9fcfbfb51c30","Type":"ContainerStarted","Data":"018eadccc416fbd6d19fdbb05595be4bb33316241be87553443503fc95406ed3"}
Apr 22 19:26:06.744203 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:06.744171 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j72zs" event={"ID":"64c67afe-5ac1-489a-897a-9fcfbfb51c30","Type":"ContainerStarted","Data":"0ba67046aae79bd576e0a0ce2bb277323b3266001e04cb5a956ead82250c9a7a"}
Apr 22 19:26:06.800759 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:06.800713 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-j72zs" podStartSLOduration=0.845505422 podStartE2EDuration="2.800699435s" podCreationTimestamp="2026-04-22 19:26:04 +0000 UTC" firstStartedPulling="2026-04-22 19:26:04.710790625 +0000 UTC m=+179.120205116" lastFinishedPulling="2026-04-22 19:26:06.665984638 +0000 UTC m=+181.075399129" observedRunningTime="2026-04-22 19:26:06.792505899 +0000 UTC m=+181.201920439" watchObservedRunningTime="2026-04-22 19:26:06.800699435 +0000 UTC m=+181.210113948"
Apr 22 19:26:08.397852 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:08.397815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8"
Apr 22 19:26:08.400268 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:08.400232 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7782c208-eb03-4951-80cf-8089fcbf8cb4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-95jw8\" (UID: \"7782c208-eb03-4951-80cf-8089fcbf8cb4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8"
Apr 22 19:26:08.671394 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:08.671318 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-jvwkk\""
Apr 22 19:26:08.679674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:08.679643 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8"
Apr 22 19:26:08.805849 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:08.805821 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8"]
Apr 22 19:26:08.808878 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:26:08.808853 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7782c208_eb03_4951_80cf_8089fcbf8cb4.slice/crio-3e99f97ef8a208a260ed49f04c3b77efbc517c69592cd67b8f9589f63b893cb5 WatchSource:0}: Error finding container 3e99f97ef8a208a260ed49f04c3b77efbc517c69592cd67b8f9589f63b893cb5: Status 404 returned error can't find the container with id 3e99f97ef8a208a260ed49f04c3b77efbc517c69592cd67b8f9589f63b893cb5
Apr 22 19:26:09.755638 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:09.755604 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" event={"ID":"7782c208-eb03-4951-80cf-8089fcbf8cb4","Type":"ContainerStarted","Data":"3e99f97ef8a208a260ed49f04c3b77efbc517c69592cd67b8f9589f63b893cb5"}
Apr 22 19:26:10.763284 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:10.763247 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" event={"ID":"7782c208-eb03-4951-80cf-8089fcbf8cb4","Type":"ContainerStarted","Data":"a43b23499b32def682c9dc2ec5bda79d1190c7af6215a038aaad8e52e7db5820"}
Apr 22 19:26:10.783569 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:10.783528 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-95jw8" podStartSLOduration=33.270870152 podStartE2EDuration="34.783516775s" podCreationTimestamp="2026-04-22 19:25:36 +0000 UTC" firstStartedPulling="2026-04-22 19:26:08.810739125 +0000 UTC m=+183.220153615" lastFinishedPulling="2026-04-22 19:26:10.323385733 +0000 UTC m=+184.732800238" observedRunningTime="2026-04-22 19:26:10.782684852 +0000 UTC m=+185.192099355" watchObservedRunningTime="2026-04-22 19:26:10.783516775 +0000 UTC m=+185.192931301"
Apr 22 19:26:27.349284 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.349249 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dc29n"]
Apr 22 19:26:27.352825 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.352796 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dc29n"
Apr 22 19:26:27.355835 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.355586 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 19:26:27.355835 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.355633 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-v96mz\""
Apr 22 19:26:27.356446 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.356425 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 19:26:27.356565 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.356433 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 19:26:27.356565 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.356521 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 19:26:27.357634 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.357615 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mc82c"]
Apr 22 19:26:27.360574 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.360558 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.363365 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.363347 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 19:26:27.363449 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.363432 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-hbmxp\"" Apr 22 19:26:27.363686 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.363654 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 19:26:27.363768 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.363741 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 19:26:27.373794 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.373777 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mc82c"] Apr 22 19:26:27.529612 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.529582 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-textfile\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.529743 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.529622 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.529743 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.529649 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g742s\" (UniqueName: \"kubernetes.io/projected/31037d76-505c-4078-9baf-70abb3048ac9-kube-api-access-g742s\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.529835 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.529739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.529835 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.529773 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31037d76-505c-4078-9baf-70abb3048ac9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 
19:26:27.529835 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.529817 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bff67443-d8bc-4920-819f-6c767147500e-sys\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.529978 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.529844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jct7\" (UniqueName: \"kubernetes.io/projected/bff67443-d8bc-4920-819f-6c767147500e-kube-api-access-4jct7\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.529978 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.529936 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-accelerators-collector-config\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.530076 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.529985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-tls\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.530076 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.530014 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.530076 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.530045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bff67443-d8bc-4920-819f-6c767147500e-root\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.530076 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.530071 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-wtmp\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.530245 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.530103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/31037d76-505c-4078-9baf-70abb3048ac9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.530245 ip-10-0-134-231 kubenswrapper[2572]: I0422 
19:26:27.530130 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.530245 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.530175 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bff67443-d8bc-4920-819f-6c767147500e-metrics-client-ca\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.631051 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.630988 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.631051 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631025 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g742s\" (UniqueName: \"kubernetes.io/projected/31037d76-505c-4078-9baf-70abb3048ac9-kube-api-access-g742s\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.631051 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.631247 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31037d76-505c-4078-9baf-70abb3048ac9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.631247 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631213 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bff67443-d8bc-4920-819f-6c767147500e-sys\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.631355 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631255 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jct7\" (UniqueName: \"kubernetes.io/projected/bff67443-d8bc-4920-819f-6c767147500e-kube-api-access-4jct7\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.631355 ip-10-0-134-231 kubenswrapper[2572]: 
I0422 19:26:27.631309 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bff67443-d8bc-4920-819f-6c767147500e-sys\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.631355 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-accelerators-collector-config\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.631500 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-tls\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.631500 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.631500 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bff67443-d8bc-4920-819f-6c767147500e-root\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.631653 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631511 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-wtmp\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.631653 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:26:27.631526 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:26:27.631653 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631536 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/31037d76-505c-4078-9baf-70abb3048ac9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.631653 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 
19:26:27.631653 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:26:27.631591 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-tls podName:bff67443-d8bc-4920-819f-6c767147500e nodeName:}" failed. No retries permitted until 2026-04-22 19:26:28.131572269 +0000 UTC m=+202.540986774 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-tls") pod "node-exporter-dc29n" (UID: "bff67443-d8bc-4920-819f-6c767147500e") : secret "node-exporter-tls" not found Apr 22 19:26:27.631653 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:26:27.631595 2572 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 19:26:27.631653 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631596 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bff67443-d8bc-4920-819f-6c767147500e-root\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.631653 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bff67443-d8bc-4920-819f-6c767147500e-metrics-client-ca\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.632055 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631717 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-textfile\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.632055 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631741 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-wtmp\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.632055 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631823 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31037d76-505c-4078-9baf-70abb3048ac9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.632055 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631842 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.632055 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:26:27.631888 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-tls podName:31037d76-505c-4078-9baf-70abb3048ac9 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:28.131872362 +0000 UTC m=+202.541286854 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-mc82c" (UID: "31037d76-505c-4078-9baf-70abb3048ac9") : secret "kube-state-metrics-tls" not found Apr 22 19:26:27.632055 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.631987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-accelerators-collector-config\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.632055 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.632014 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-textfile\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.632055 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.632019 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/31037d76-505c-4078-9baf-70abb3048ac9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.632290 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.632164 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bff67443-d8bc-4920-819f-6c767147500e-metrics-client-ca\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.634201 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.634177 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.634481 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.634466 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:27.640190 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.640168 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g742s\" (UniqueName: \"kubernetes.io/projected/31037d76-505c-4078-9baf-70abb3048ac9-kube-api-access-g742s\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") 
" pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:27.641244 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:27.641224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jct7\" (UniqueName: \"kubernetes.io/projected/bff67443-d8bc-4920-819f-6c767147500e-kube-api-access-4jct7\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:28.135484 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:28.135457 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-tls\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:28.135653 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:28.135492 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:28.135653 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:26:28.135608 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:26:28.135804 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:26:28.135702 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-tls podName:bff67443-d8bc-4920-819f-6c767147500e nodeName:}" failed. No retries permitted until 2026-04-22 19:26:29.135682567 +0000 UTC m=+203.545097072 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-tls") pod "node-exporter-dc29n" (UID: "bff67443-d8bc-4920-819f-6c767147500e") : secret "node-exporter-tls" not found Apr 22 19:26:28.137752 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:28.137732 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/31037d76-505c-4078-9baf-70abb3048ac9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mc82c\" (UID: \"31037d76-505c-4078-9baf-70abb3048ac9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:28.269248 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:28.269224 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" Apr 22 19:26:28.432257 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:28.432233 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mc82c"] Apr 22 19:26:28.434332 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:26:28.434308 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31037d76_505c_4078_9baf_70abb3048ac9.slice/crio-832b5d243b9760bbf3b9581d1d5f0cb6efacc5c2394fc7b2422acab3c56c34fb WatchSource:0}: Error finding container 832b5d243b9760bbf3b9581d1d5f0cb6efacc5c2394fc7b2422acab3c56c34fb: Status 404 returned error can't find the container with id 832b5d243b9760bbf3b9581d1d5f0cb6efacc5c2394fc7b2422acab3c56c34fb Apr 22 19:26:28.810933 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:28.810865 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" event={"ID":"31037d76-505c-4078-9baf-70abb3048ac9","Type":"ContainerStarted","Data":"832b5d243b9760bbf3b9581d1d5f0cb6efacc5c2394fc7b2422acab3c56c34fb"} Apr 22 19:26:29.144168 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:29.144076 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-tls\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:29.146738 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:29.146710 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bff67443-d8bc-4920-819f-6c767147500e-node-exporter-tls\") pod \"node-exporter-dc29n\" (UID: \"bff67443-d8bc-4920-819f-6c767147500e\") " pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:29.162714 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:29.162691 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dc29n" Apr 22 19:26:29.172348 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:26:29.172309 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff67443_d8bc_4920_819f_6c767147500e.slice/crio-2f9401a595763d399c3d91d99218bdbd1eed21108778b7b84319789eef74b690 WatchSource:0}: Error finding container 2f9401a595763d399c3d91d99218bdbd1eed21108778b7b84319789eef74b690: Status 404 returned error can't find the container with id 2f9401a595763d399c3d91d99218bdbd1eed21108778b7b84319789eef74b690 Apr 22 19:26:29.824757 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:29.824709 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" event={"ID":"31037d76-505c-4078-9baf-70abb3048ac9","Type":"ContainerStarted","Data":"8758bc882a2ba4ea163f8029fb8f891bcd8e2cc478db4f14004e4d802250bdb3"} Apr 22 19:26:29.824757 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:29.824749 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" event={"ID":"31037d76-505c-4078-9baf-70abb3048ac9","Type":"ContainerStarted","Data":"d27cbe9c013d03a1c550e3520dbbe936b241b841cfc9e643acb155bf5164ec2d"} Apr 22 19:26:29.826534 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:29.826508 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dc29n" event={"ID":"bff67443-d8bc-4920-819f-6c767147500e","Type":"ContainerStarted","Data":"2f9401a595763d399c3d91d99218bdbd1eed21108778b7b84319789eef74b690"} Apr 22 19:26:30.830238 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:30.830205 2572 generic.go:358] "Generic (PLEG): container finished" podID="bff67443-d8bc-4920-819f-6c767147500e" containerID="041e68d20779b0ab28b24fdbf99a2d731a83ac18f1893f370be3d86238d7be60" exitCode=0 Apr 22 19:26:30.830727 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:30.830303 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dc29n" event={"ID":"bff67443-d8bc-4920-819f-6c767147500e","Type":"ContainerDied","Data":"041e68d20779b0ab28b24fdbf99a2d731a83ac18f1893f370be3d86238d7be60"} Apr 22 19:26:30.832270 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:30.832247 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" event={"ID":"31037d76-505c-4078-9baf-70abb3048ac9","Type":"ContainerStarted","Data":"08813ec6b506d148618dd04dbabe1d0caed6489c622d125ec4ce476ad90eb5e0"} Apr 22 19:26:30.869580 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:30.869530 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-mc82c" podStartSLOduration=2.667652805 podStartE2EDuration="3.869514244s" podCreationTimestamp="2026-04-22 19:26:27 +0000 UTC" firstStartedPulling="2026-04-22 19:26:28.436223476 +0000 UTC m=+202.845637966" lastFinishedPulling="2026-04-22 19:26:29.638084901 +0000 UTC m=+204.047499405" observedRunningTime="2026-04-22 19:26:30.869030311 +0000 UTC m=+205.278444826" watchObservedRunningTime="2026-04-22 19:26:30.869514244 +0000 UTC m=+205.278928757" Apr 22 19:26:31.836212 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:31.836172 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dc29n" 
event={"ID":"bff67443-d8bc-4920-819f-6c767147500e","Type":"ContainerStarted","Data":"e07147ea716f4382f8651698463b0be3858bde6c021005a8d914f1822db37d91"} Apr 22 19:26:31.836212 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:31.836205 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dc29n" event={"ID":"bff67443-d8bc-4920-819f-6c767147500e","Type":"ContainerStarted","Data":"94b03aecbe842713963938cd3491ff994e71d11e67c7d89a4cf4765d4173ab69"} Apr 22 19:26:31.856804 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:31.856757 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dc29n" podStartSLOduration=3.899474151 podStartE2EDuration="4.856743636s" podCreationTimestamp="2026-04-22 19:26:27 +0000 UTC" firstStartedPulling="2026-04-22 19:26:29.174462299 +0000 UTC m=+203.583876803" lastFinishedPulling="2026-04-22 19:26:30.131731784 +0000 UTC m=+204.541146288" observedRunningTime="2026-04-22 19:26:31.855036628 +0000 UTC m=+206.264451136" watchObservedRunningTime="2026-04-22 19:26:31.856743636 +0000 UTC m=+206.266158149" Apr 22 19:26:33.630797 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.630766 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:26:33.638523 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.638498 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.642553 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.642522 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 19:26:33.642732 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.642576 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 19:26:33.642732 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.642522 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 19:26:33.642732 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.642652 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 19:26:33.642732 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.642682 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 19:26:33.642732 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.642586 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 19:26:33.643514 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.643487 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 19:26:33.643785 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.643766 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1ccbdn6cu4bb4\"" Apr 22 19:26:33.643785 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.643792 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 19:26:33.643967 ip-10-0-134-231 kubenswrapper[2572]: I0422 
19:26:33.643826 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 19:26:33.643967 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.643894 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 19:26:33.644952 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.644917 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-7rzcw\"" Apr 22 19:26:33.645081 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.645023 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 19:26:33.647918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.647899 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 19:26:33.651135 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.651119 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 19:26:33.656876 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.656857 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:26:33.783165 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783142 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783267 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783171 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783267 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783195 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/826e6e59-5db2-4b03-9772-c97128b47870-config-out\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783357 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783357 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783290 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783357 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783325 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-web-config\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783357 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783354 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783484 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783373 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783484 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783484 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783423 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/826e6e59-5db2-4b03-9772-c97128b47870-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783484 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783484 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783478 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t4ww\" (UniqueName: \"kubernetes.io/projected/826e6e59-5db2-4b03-9772-c97128b47870-kube-api-access-5t4ww\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783655 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-config\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783655 ip-10-0-134-231 
kubenswrapper[2572]: I0422 19:26:33.783552 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783655 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783655 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783611 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/826e6e59-5db2-4b03-9772-c97128b47870-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783655 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.783655 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.783642 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.884593 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884534 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.884593 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884559 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.884593 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884584 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/826e6e59-5db2-4b03-9772-c97128b47870-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.884793 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.884793 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.884793 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884649 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.884793 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884691 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.884793 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/826e6e59-5db2-4b03-9772-c97128b47870-config-out\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.884793 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.884793 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.885228 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-web-config\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.885228 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.885228 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884863 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.885228 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.885228 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/826e6e59-5db2-4b03-9772-c97128b47870-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.885228 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884947 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.885228 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.884980 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5t4ww\" (UniqueName: \"kubernetes.io/projected/826e6e59-5db2-4b03-9772-c97128b47870-kube-api-access-5t4ww\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.885228 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.885011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-config\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.885770 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.885746 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.887121 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.887092 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.887882 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.887861 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/826e6e59-5db2-4b03-9772-c97128b47870-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.888214 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.888179 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.888914 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.888805 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.888914 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.888869 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.888914 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.888893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.889914 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.889303 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/826e6e59-5db2-4b03-9772-c97128b47870-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.889914 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.889467 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.889914 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.889715 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-config\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.890243 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.890213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.890352 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.890294 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/826e6e59-5db2-4b03-9772-c97128b47870-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.890352 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.890319 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.890686 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.890647 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.890866 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.890848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/826e6e59-5db2-4b03-9772-c97128b47870-config-out\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.891436 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.891407 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-web-config\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.891734 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.891716 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/826e6e59-5db2-4b03-9772-c97128b47870-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.896822 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.896804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t4ww\" (UniqueName: \"kubernetes.io/projected/826e6e59-5db2-4b03-9772-c97128b47870-kube-api-access-5t4ww\") pod \"prometheus-k8s-0\" (UID: \"826e6e59-5db2-4b03-9772-c97128b47870\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:33.947555 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:33.947535 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:34.069590 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:34.069561 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:26:34.074794 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:26:34.074770 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod826e6e59_5db2_4b03_9772_c97128b47870.slice/crio-6d507f186de39cefb9a674fa613c4e38297422a0c189f2490bcf076106163861 WatchSource:0}: Error finding container 6d507f186de39cefb9a674fa613c4e38297422a0c189f2490bcf076106163861: Status 404 returned error can't find the container with id 6d507f186de39cefb9a674fa613c4e38297422a0c189f2490bcf076106163861 Apr 22 19:26:34.845514 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:34.845475 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"826e6e59-5db2-4b03-9772-c97128b47870","Type":"ContainerStarted","Data":"6d507f186de39cefb9a674fa613c4e38297422a0c189f2490bcf076106163861"} Apr 22 19:26:35.849777 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:35.849742 2572 generic.go:358] "Generic (PLEG): container finished" podID="826e6e59-5db2-4b03-9772-c97128b47870" containerID="4b3196d2631ca25367c627465b6ea53659785db5e165b76f7a255ec4d8ccfdc1" exitCode=0 Apr 22 19:26:35.850133 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:35.849783 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"826e6e59-5db2-4b03-9772-c97128b47870","Type":"ContainerDied","Data":"4b3196d2631ca25367c627465b6ea53659785db5e165b76f7a255ec4d8ccfdc1"} Apr 22 19:26:38.859272 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:38.859237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"826e6e59-5db2-4b03-9772-c97128b47870","Type":"ContainerStarted","Data":"ef229c7bf30649300551a0c22c9d1dbb7ae89080d1f6b8436d68c4b7a879bb1f"} Apr 22 19:26:38.859642 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:38.859278 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"826e6e59-5db2-4b03-9772-c97128b47870","Type":"ContainerStarted","Data":"9b31f50acd2a62e59039834e5c3317afc8a00e2cfa3e759c1c9e5a7edfbbc219"} Apr 22 19:26:40.872486 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:40.872449 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"826e6e59-5db2-4b03-9772-c97128b47870","Type":"ContainerStarted","Data":"84f4bed10047fbc7a7c5f8b97a0a82aadfc87e60131dafd3deec2367d9109e73"} Apr 22 19:26:40.872486 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:40.872488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"826e6e59-5db2-4b03-9772-c97128b47870","Type":"ContainerStarted","Data":"ec39ba82a63acc60d6502c18678566f37f8c680238c64c91a902a132295faf72"} Apr 22 19:26:40.872968 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:40.872500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"826e6e59-5db2-4b03-9772-c97128b47870","Type":"ContainerStarted","Data":"6ecdd44aca7a8e5cfd25afab2d910fc272e98782c1f1f91f804a38c83294c435"} Apr 22 19:26:40.872968 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:40.872511 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"826e6e59-5db2-4b03-9772-c97128b47870","Type":"ContainerStarted","Data":"603b677b8508394941c6894671f292dd8c974c327e9f23de6e1468c50b0a93f5"} Apr 22 19:26:40.905915 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:40.905868 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.991759993 podStartE2EDuration="7.905850293s" podCreationTimestamp="2026-04-22 19:26:33 +0000 UTC" firstStartedPulling="2026-04-22 19:26:34.076712857 +0000 UTC m=+208.486127348" lastFinishedPulling="2026-04-22 19:26:39.990803143 +0000 UTC m=+214.400217648" observedRunningTime="2026-04-22 19:26:40.904205139 +0000 UTC m=+215.313619651" watchObservedRunningTime="2026-04-22 19:26:40.905850293 +0000 UTC m=+215.315264806" Apr 22 19:26:43.947686 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:26:43.947643 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:01.572735 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:01.572707 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zv45q_2c867e00-703c-41c3-8964-4eee7b3451c9/dns-node-resolver/0.log" Apr 22 19:27:17.004994 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:17.004955 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:27:17.007401 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:17.007378 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def7cd86-6b79-4c5f-900f-f09644520b6b-metrics-certs\") pod \"network-metrics-daemon-w244z\" (UID: \"def7cd86-6b79-4c5f-900f-f09644520b6b\") " pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:27:17.169506 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:17.169482 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ff584\"" Apr 22 19:27:17.176809 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:17.176793 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w244z" Apr 22 19:27:17.290590 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:17.290543 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-w244z"] Apr 22 19:27:17.293456 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:27:17.293426 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddef7cd86_6b79_4c5f_900f_f09644520b6b.slice/crio-ee89ada4ec05730724273b2894f89acc3cd2e52533137a39184ff6e700d90926 WatchSource:0}: Error finding container ee89ada4ec05730724273b2894f89acc3cd2e52533137a39184ff6e700d90926: Status 404 returned error can't find the container with id ee89ada4ec05730724273b2894f89acc3cd2e52533137a39184ff6e700d90926 Apr 22 19:27:17.969418 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:17.969378 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w244z" event={"ID":"def7cd86-6b79-4c5f-900f-f09644520b6b","Type":"ContainerStarted","Data":"ee89ada4ec05730724273b2894f89acc3cd2e52533137a39184ff6e700d90926"} Apr 22 19:27:18.976102 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:18.976057 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w244z" event={"ID":"def7cd86-6b79-4c5f-900f-f09644520b6b","Type":"ContainerStarted","Data":"2555c5c83dcedc0d284da10a922e44d43fe4ab6b9ebecd046f35ec07edcd7e78"} Apr 22 19:27:18.976102 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:18.976105 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w244z" event={"ID":"def7cd86-6b79-4c5f-900f-f09644520b6b","Type":"ContainerStarted","Data":"9a492dd41d91ca4beb320dc00c390c84415d67b3786634b55b84a180b5671f50"} Apr 22 19:27:18.993227 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:18.993170 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-w244z" podStartSLOduration=252.072771923 podStartE2EDuration="4m12.993154589s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:27:17.295709958 +0000 UTC m=+251.705124449" lastFinishedPulling="2026-04-22 19:27:18.21609262 +0000 UTC m=+252.625507115" observedRunningTime="2026-04-22 19:27:18.991780878 +0000 UTC m=+253.401195391" watchObservedRunningTime="2026-04-22 19:27:18.993154589 +0000 UTC m=+253.402569103" Apr 22 19:27:33.948046 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:33.947943 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:33.965024 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:33.964995 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:34.031514 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:34.031492 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:44.672933 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:27:44.672891 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-cr9b2" podUID="b3237a2b-dfbc-4c60-8166-c94d61b4467f" Apr 22 19:27:45.048514 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:45.048429 2572 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cr9b2" Apr 22 19:27:48.439262 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:48.439211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:27:48.439262 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:48.439265 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:27:48.441695 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:48.441649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3237a2b-dfbc-4c60-8166-c94d61b4467f-metrics-tls\") pod \"dns-default-cr9b2\" (UID: \"b3237a2b-dfbc-4c60-8166-c94d61b4467f\") " pod="openshift-dns/dns-default-cr9b2" Apr 22 19:27:48.441929 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:48.441909 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75bc368d-1a1a-4f77-9a39-1a1b256f1eb6-cert\") pod \"ingress-canary-bv9cr\" (UID: \"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6\") " pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:27:48.469909 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:48.469885 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrkhq\"" Apr 22 19:27:48.477994 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:48.477974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bv9cr" Apr 22 19:27:48.591636 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:48.591606 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bv9cr"] Apr 22 19:27:48.594857 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:27:48.594831 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75bc368d_1a1a_4f77_9a39_1a1b256f1eb6.slice/crio-1a1fb59d8b6057a491f1fe741c8ee25e4c128250915212fa10ef8bbbc7a2347d WatchSource:0}: Error finding container 1a1fb59d8b6057a491f1fe741c8ee25e4c128250915212fa10ef8bbbc7a2347d: Status 404 returned error can't find the container with id 1a1fb59d8b6057a491f1fe741c8ee25e4c128250915212fa10ef8bbbc7a2347d Apr 22 19:27:48.652219 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:48.652197 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6f5sr\"" Apr 22 19:27:48.660481 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:48.660464 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cr9b2" Apr 22 19:27:48.770904 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:48.770883 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cr9b2"] Apr 22 19:27:48.773293 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:27:48.773251 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3237a2b_dfbc_4c60_8166_c94d61b4467f.slice/crio-f0996b3c884db7419bd0d6349398770a62be4cd90bd0ade4f377d91776e5706b WatchSource:0}: Error finding container f0996b3c884db7419bd0d6349398770a62be4cd90bd0ade4f377d91776e5706b: Status 404 returned error can't find the container with id f0996b3c884db7419bd0d6349398770a62be4cd90bd0ade4f377d91776e5706b Apr 22 19:27:49.061080 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:49.060989 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cr9b2" event={"ID":"b3237a2b-dfbc-4c60-8166-c94d61b4467f","Type":"ContainerStarted","Data":"f0996b3c884db7419bd0d6349398770a62be4cd90bd0ade4f377d91776e5706b"} Apr 22 19:27:49.062206 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:49.062166 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bv9cr" event={"ID":"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6","Type":"ContainerStarted","Data":"1a1fb59d8b6057a491f1fe741c8ee25e4c128250915212fa10ef8bbbc7a2347d"} Apr 22 19:27:51.069552 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:51.069516 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cr9b2" event={"ID":"b3237a2b-dfbc-4c60-8166-c94d61b4467f","Type":"ContainerStarted","Data":"cedc31f5cc5466b466cd68ec9b8cf68d8b29a76201533149639cd5b2fa241872"} Apr 22 19:27:51.069552 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:51.069556 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cr9b2" event={"ID":"b3237a2b-dfbc-4c60-8166-c94d61b4467f","Type":"ContainerStarted","Data":"47420ec5d994f014cf75ad08434c8476dc6dd4a991515dd8f88cee0afb366bbb"} Apr 22 19:27:51.070054 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:51.069622 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cr9b2" Apr 22 19:27:51.070786 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:51.070765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bv9cr" event={"ID":"75bc368d-1a1a-4f77-9a39-1a1b256f1eb6","Type":"ContainerStarted","Data":"83298f4290151c9ba22fba1f7a67e87f0930c3453bb36336b0b55d93199c72a4"} Apr 22 19:27:51.087242 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:51.087193 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cr9b2" podStartSLOduration=251.382169777 podStartE2EDuration="4m13.08718102s" podCreationTimestamp="2026-04-22 19:23:38 +0000 UTC" firstStartedPulling="2026-04-22 19:27:48.775161195 +0000 UTC m=+283.184575689" lastFinishedPulling="2026-04-22 19:27:50.480172441 +0000 UTC m=+284.889586932" observedRunningTime="2026-04-22 19:27:51.08611454 +0000 UTC m=+285.495529054" watchObservedRunningTime="2026-04-22 19:27:51.08718102 +0000 UTC m=+285.496595532" Apr 22 19:27:51.101444 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:27:51.101393 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bv9cr" podStartSLOduration=251.213994974 
podStartE2EDuration="4m13.101378563s" podCreationTimestamp="2026-04-22 19:23:38 +0000 UTC" firstStartedPulling="2026-04-22 19:27:48.596984497 +0000 UTC m=+283.006398988" lastFinishedPulling="2026-04-22 19:27:50.484368086 +0000 UTC m=+284.893782577" observedRunningTime="2026-04-22 19:27:51.099751419 +0000 UTC m=+285.509165932" watchObservedRunningTime="2026-04-22 19:27:51.101378563 +0000 UTC m=+285.510793077" Apr 22 19:28:01.077046 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:28:01.077008 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cr9b2" Apr 22 19:28:06.047931 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:28:06.047896 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:28:06.048449 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:28:06.048430 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:28:06.055639 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:28:06.055614 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 19:33:06.073146 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:33:06.073120 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:33:06.073677 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:33:06.073127 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:36:07.918297 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:07.918265 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-k5hv4"] Apr 22 19:36:07.921505 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:07.921488 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-k5hv4" Apr 22 19:36:07.925185 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:07.925162 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 19:36:07.926361 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:07.926345 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-ktxrw\"" Apr 22 19:36:07.926447 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:07.926366 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 19:36:07.926447 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:07.926346 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 19:36:07.933214 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:07.933194 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-k5hv4"] Apr 22 19:36:07.985128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:07.985106 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnxl6\" (UniqueName: \"kubernetes.io/projected/23afe627-399d-4e32-9760-bac458db5ba8-kube-api-access-qnxl6\") pod \"model-serving-api-86f7b4b499-k5hv4\" (UID: \"23afe627-399d-4e32-9760-bac458db5ba8\") " pod="kserve/model-serving-api-86f7b4b499-k5hv4" Apr 22 19:36:07.985245 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:07.985140 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23afe627-399d-4e32-9760-bac458db5ba8-tls-certs\") pod \"model-serving-api-86f7b4b499-k5hv4\" (UID: \"23afe627-399d-4e32-9760-bac458db5ba8\") " pod="kserve/model-serving-api-86f7b4b499-k5hv4" Apr 22 19:36:08.086068 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:08.086040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnxl6\" (UniqueName: \"kubernetes.io/projected/23afe627-399d-4e32-9760-bac458db5ba8-kube-api-access-qnxl6\") pod \"model-serving-api-86f7b4b499-k5hv4\" (UID: \"23afe627-399d-4e32-9760-bac458db5ba8\") " pod="kserve/model-serving-api-86f7b4b499-k5hv4" Apr 22 19:36:08.086196 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:08.086084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23afe627-399d-4e32-9760-bac458db5ba8-tls-certs\") pod \"model-serving-api-86f7b4b499-k5hv4\" (UID: \"23afe627-399d-4e32-9760-bac458db5ba8\") " pod="kserve/model-serving-api-86f7b4b499-k5hv4" Apr 22 19:36:08.088339 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:08.088317 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23afe627-399d-4e32-9760-bac458db5ba8-tls-certs\") pod \"model-serving-api-86f7b4b499-k5hv4\" (UID: \"23afe627-399d-4e32-9760-bac458db5ba8\") " pod="kserve/model-serving-api-86f7b4b499-k5hv4" Apr 22 19:36:08.094730 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:08.094711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnxl6\" (UniqueName: \"kubernetes.io/projected/23afe627-399d-4e32-9760-bac458db5ba8-kube-api-access-qnxl6\") pod \"model-serving-api-86f7b4b499-k5hv4\" (UID: \"23afe627-399d-4e32-9760-bac458db5ba8\") " 
pod="kserve/model-serving-api-86f7b4b499-k5hv4" Apr 22 19:36:08.231470 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:08.231411 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-k5hv4" Apr 22 19:36:08.347225 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:08.347185 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-k5hv4"] Apr 22 19:36:08.350402 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:36:08.350375 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23afe627_399d_4e32_9760_bac458db5ba8.slice/crio-3e4c1ed7802e30b337d995a9caef8098ab4f8220df87edad6d3e0edb48f36276 WatchSource:0}: Error finding container 3e4c1ed7802e30b337d995a9caef8098ab4f8220df87edad6d3e0edb48f36276: Status 404 returned error can't find the container with id 3e4c1ed7802e30b337d995a9caef8098ab4f8220df87edad6d3e0edb48f36276 Apr 22 19:36:08.352429 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:08.352415 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:36:08.469002 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:08.468978 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-k5hv4" event={"ID":"23afe627-399d-4e32-9760-bac458db5ba8","Type":"ContainerStarted","Data":"3e4c1ed7802e30b337d995a9caef8098ab4f8220df87edad6d3e0edb48f36276"} Apr 22 19:36:11.478385 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:11.478348 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-k5hv4" event={"ID":"23afe627-399d-4e32-9760-bac458db5ba8","Type":"ContainerStarted","Data":"fd62fcbbe1c2a1dc893e5ab2da47f936757ac02cad7416a8801dbe9492aa7e9a"} Apr 22 19:36:11.478778 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:11.478467 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-k5hv4" Apr 22 19:36:11.497242 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:11.497195 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-k5hv4" podStartSLOduration=2.27788445 podStartE2EDuration="4.49717863s" podCreationTimestamp="2026-04-22 19:36:07 +0000 UTC" firstStartedPulling="2026-04-22 19:36:08.352535503 +0000 UTC m=+782.761949994" lastFinishedPulling="2026-04-22 19:36:10.57182968 +0000 UTC m=+784.981244174" observedRunningTime="2026-04-22 19:36:11.49620782 +0000 UTC m=+785.905622333" watchObservedRunningTime="2026-04-22 19:36:11.49717863 +0000 UTC m=+785.906593144" Apr 22 19:36:22.484929 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:22.484900 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-k5hv4" Apr 22 19:36:23.530843 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:23.530801 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-h5lwd"] Apr 22 19:36:23.534084 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:23.534065 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-h5lwd" Apr 22 19:36:23.536721 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:23.536691 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 19:36:23.536721 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:23.536708 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-c5d95\"" Apr 22 19:36:23.540380 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:23.540361 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-h5lwd"] Apr 22 19:36:23.591652 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:23.591625 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d45bk\" (UniqueName: \"kubernetes.io/projected/fbca928d-0eee-4a59-a378-4bcc47205b07-kube-api-access-d45bk\") pod \"s3-init-h5lwd\" (UID: \"fbca928d-0eee-4a59-a378-4bcc47205b07\") " pod="kserve/s3-init-h5lwd" Apr 22 19:36:23.692652 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:23.692629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d45bk\" (UniqueName: \"kubernetes.io/projected/fbca928d-0eee-4a59-a378-4bcc47205b07-kube-api-access-d45bk\") pod \"s3-init-h5lwd\" (UID: \"fbca928d-0eee-4a59-a378-4bcc47205b07\") " pod="kserve/s3-init-h5lwd" Apr 22 19:36:23.701467 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:23.701447 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d45bk\" (UniqueName: \"kubernetes.io/projected/fbca928d-0eee-4a59-a378-4bcc47205b07-kube-api-access-d45bk\") pod \"s3-init-h5lwd\" (UID: \"fbca928d-0eee-4a59-a378-4bcc47205b07\") " pod="kserve/s3-init-h5lwd" Apr 22 19:36:23.851080 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:23.851009 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-h5lwd" Apr 22 19:36:23.964427 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:23.964395 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-h5lwd"] Apr 22 19:36:23.966577 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:36:23.966552 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbca928d_0eee_4a59_a378_4bcc47205b07.slice/crio-091e460ffee9f93bc52e6e77d4b488238e3e059865a1a7c44e269414aecda25c WatchSource:0}: Error finding container 091e460ffee9f93bc52e6e77d4b488238e3e059865a1a7c44e269414aecda25c: Status 404 returned error can't find the container with id 091e460ffee9f93bc52e6e77d4b488238e3e059865a1a7c44e269414aecda25c Apr 22 19:36:24.518766 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:24.518731 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h5lwd" event={"ID":"fbca928d-0eee-4a59-a378-4bcc47205b07","Type":"ContainerStarted","Data":"091e460ffee9f93bc52e6e77d4b488238e3e059865a1a7c44e269414aecda25c"} Apr 22 19:36:28.534446 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:28.534408 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h5lwd" event={"ID":"fbca928d-0eee-4a59-a378-4bcc47205b07","Type":"ContainerStarted","Data":"2fdf518edcf430f58143712de6f300e1ff6d4ca96505b92c95e423c4471e4148"} Apr 22 19:36:28.550640 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:28.550599 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-h5lwd" podStartSLOduration=1.140757308 podStartE2EDuration="5.550585807s" podCreationTimestamp="2026-04-22 19:36:23 +0000 UTC" firstStartedPulling="2026-04-22 19:36:23.968781307 +0000 UTC m=+798.378195801" lastFinishedPulling="2026-04-22 19:36:28.378609794 +0000 UTC m=+802.788024300" observedRunningTime="2026-04-22 19:36:28.54935572 +0000 UTC m=+802.958770245" watchObservedRunningTime="2026-04-22 19:36:28.550585807 +0000 UTC m=+802.960000319" Apr 22 19:36:31.544790 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:31.544719 2572 generic.go:358] "Generic (PLEG): container finished" podID="fbca928d-0eee-4a59-a378-4bcc47205b07" containerID="2fdf518edcf430f58143712de6f300e1ff6d4ca96505b92c95e423c4471e4148" exitCode=0 Apr 22 19:36:31.545109 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:31.544789 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h5lwd" event={"ID":"fbca928d-0eee-4a59-a378-4bcc47205b07","Type":"ContainerDied","Data":"2fdf518edcf430f58143712de6f300e1ff6d4ca96505b92c95e423c4471e4148"} Apr 22 19:36:32.661983 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:32.661960 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-h5lwd" Apr 22 19:36:32.764365 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:32.764336 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d45bk\" (UniqueName: \"kubernetes.io/projected/fbca928d-0eee-4a59-a378-4bcc47205b07-kube-api-access-d45bk\") pod \"fbca928d-0eee-4a59-a378-4bcc47205b07\" (UID: \"fbca928d-0eee-4a59-a378-4bcc47205b07\") " Apr 22 19:36:32.766378 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:32.766356 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbca928d-0eee-4a59-a378-4bcc47205b07-kube-api-access-d45bk" (OuterVolumeSpecName: "kube-api-access-d45bk") pod "fbca928d-0eee-4a59-a378-4bcc47205b07" (UID: "fbca928d-0eee-4a59-a378-4bcc47205b07"). InnerVolumeSpecName "kube-api-access-d45bk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:36:32.864897 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:32.864839 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d45bk\" (UniqueName: \"kubernetes.io/projected/fbca928d-0eee-4a59-a378-4bcc47205b07-kube-api-access-d45bk\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:36:33.551028 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:33.551001 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-h5lwd" Apr 22 19:36:33.551028 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:33.551028 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h5lwd" event={"ID":"fbca928d-0eee-4a59-a378-4bcc47205b07","Type":"ContainerDied","Data":"091e460ffee9f93bc52e6e77d4b488238e3e059865a1a7c44e269414aecda25c"} Apr 22 19:36:33.551247 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:36:33.551059 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="091e460ffee9f93bc52e6e77d4b488238e3e059865a1a7c44e269414aecda25c" Apr 22 19:37:10.659273 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:10.659237 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-vn9ng"] Apr 22 19:37:10.659858 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:10.659542 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbca928d-0eee-4a59-a378-4bcc47205b07" containerName="s3-init" Apr 22 19:37:10.659858 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:10.659555 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbca928d-0eee-4a59-a378-4bcc47205b07" containerName="s3-init" Apr 22 19:37:10.659858 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:10.659602 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbca928d-0eee-4a59-a378-4bcc47205b07" containerName="s3-init" Apr 22 19:37:10.694403 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:10.694376 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-vn9ng"] Apr 22 19:37:10.694548 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:10.694467 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-vn9ng" Apr 22 19:37:10.697286 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:10.697266 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-c5d95\"" Apr 22 19:37:10.697422 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:10.697266 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 22 19:37:10.813192 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:10.813161 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqbvd\" (UniqueName: \"kubernetes.io/projected/4ef1cde4-40cc-4920-9853-ffcd9ebc7560-kube-api-access-vqbvd\") pod \"s3-tls-init-custom-vn9ng\" (UID: \"4ef1cde4-40cc-4920-9853-ffcd9ebc7560\") " pod="kserve/s3-tls-init-custom-vn9ng" Apr 22 19:37:10.914484 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:10.914417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqbvd\" (UniqueName: \"kubernetes.io/projected/4ef1cde4-40cc-4920-9853-ffcd9ebc7560-kube-api-access-vqbvd\") pod \"s3-tls-init-custom-vn9ng\" (UID: \"4ef1cde4-40cc-4920-9853-ffcd9ebc7560\") " pod="kserve/s3-tls-init-custom-vn9ng" Apr 22 19:37:10.922889 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:10.922865 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqbvd\" (UniqueName: \"kubernetes.io/projected/4ef1cde4-40cc-4920-9853-ffcd9ebc7560-kube-api-access-vqbvd\") pod \"s3-tls-init-custom-vn9ng\" (UID: \"4ef1cde4-40cc-4920-9853-ffcd9ebc7560\") " pod="kserve/s3-tls-init-custom-vn9ng" Apr 22 19:37:11.013533 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:11.013513 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-vn9ng" Apr 22 19:37:11.128795 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:11.128773 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-vn9ng"] Apr 22 19:37:11.131080 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:37:11.131054 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ef1cde4_40cc_4920_9853_ffcd9ebc7560.slice/crio-c6f8ed3192f58656977c6dbaa9efd3bbd505a23a2e5558afee2e851fc31464eb WatchSource:0}: Error finding container c6f8ed3192f58656977c6dbaa9efd3bbd505a23a2e5558afee2e851fc31464eb: Status 404 returned error can't find the container with id c6f8ed3192f58656977c6dbaa9efd3bbd505a23a2e5558afee2e851fc31464eb Apr 22 19:37:11.664034 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:11.663999 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-vn9ng" event={"ID":"4ef1cde4-40cc-4920-9853-ffcd9ebc7560","Type":"ContainerStarted","Data":"61308b2e6773c17c1fd9e831ad071ccf944d212fbc45ce659d0b9b819fccb985"} Apr 22 19:37:11.664381 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:11.664039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-vn9ng" event={"ID":"4ef1cde4-40cc-4920-9853-ffcd9ebc7560","Type":"ContainerStarted","Data":"c6f8ed3192f58656977c6dbaa9efd3bbd505a23a2e5558afee2e851fc31464eb"} Apr 22 19:37:11.679887 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:11.679836 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-vn9ng" podStartSLOduration=1.679817468 podStartE2EDuration="1.679817468s" podCreationTimestamp="2026-04-22 19:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:37:11.679297342 +0000 UTC m=+846.088711858" watchObservedRunningTime="2026-04-22 19:37:11.679817468 +0000 UTC m=+846.089231983" Apr 22 19:37:16.678297 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:16.678269 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ef1cde4-40cc-4920-9853-ffcd9ebc7560" containerID="61308b2e6773c17c1fd9e831ad071ccf944d212fbc45ce659d0b9b819fccb985" exitCode=0 Apr 22 19:37:16.678695 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:16.678325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-vn9ng" event={"ID":"4ef1cde4-40cc-4920-9853-ffcd9ebc7560","Type":"ContainerDied","Data":"61308b2e6773c17c1fd9e831ad071ccf944d212fbc45ce659d0b9b819fccb985"} Apr 22 19:37:17.801043 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:17.801022 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-vn9ng" Apr 22 19:37:17.966122 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:17.966058 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqbvd\" (UniqueName: \"kubernetes.io/projected/4ef1cde4-40cc-4920-9853-ffcd9ebc7560-kube-api-access-vqbvd\") pod \"4ef1cde4-40cc-4920-9853-ffcd9ebc7560\" (UID: \"4ef1cde4-40cc-4920-9853-ffcd9ebc7560\") " Apr 22 19:37:17.967986 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:17.967959 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef1cde4-40cc-4920-9853-ffcd9ebc7560-kube-api-access-vqbvd" (OuterVolumeSpecName: "kube-api-access-vqbvd") pod "4ef1cde4-40cc-4920-9853-ffcd9ebc7560" (UID: "4ef1cde4-40cc-4920-9853-ffcd9ebc7560"). InnerVolumeSpecName "kube-api-access-vqbvd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:37:18.071244 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:18.067215 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vqbvd\" (UniqueName: \"kubernetes.io/projected/4ef1cde4-40cc-4920-9853-ffcd9ebc7560-kube-api-access-vqbvd\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:37:18.198674 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:37:18.198643 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ef1cde4_40cc_4920_9853_ffcd9ebc7560.slice\": RecentStats: unable to find data in memory cache]" Apr 22 19:37:18.686489 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:18.686455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-vn9ng" event={"ID":"4ef1cde4-40cc-4920-9853-ffcd9ebc7560","Type":"ContainerDied","Data":"c6f8ed3192f58656977c6dbaa9efd3bbd505a23a2e5558afee2e851fc31464eb"} Apr 22 19:37:18.686489 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:18.686485 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6f8ed3192f58656977c6dbaa9efd3bbd505a23a2e5558afee2e851fc31464eb" Apr 22 19:37:18.686742 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:18.686492 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-vn9ng" Apr 22 19:37:21.476043 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.476012 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-fdt9x"] Apr 22 19:37:21.476470 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.476288 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ef1cde4-40cc-4920-9853-ffcd9ebc7560" containerName="s3-tls-init-custom" Apr 22 19:37:21.476470 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.476299 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef1cde4-40cc-4920-9853-ffcd9ebc7560" containerName="s3-tls-init-custom" Apr 22 19:37:21.476470 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.476351 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ef1cde4-40cc-4920-9853-ffcd9ebc7560" containerName="s3-tls-init-custom" Apr 22 19:37:21.479470 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.479454 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-fdt9x" Apr 22 19:37:21.482063 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.482036 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 22 19:37:21.482201 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.482087 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-c5d95\"" Apr 22 19:37:21.485087 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.485064 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-fdt9x"] Apr 22 19:37:21.589897 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.589828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh447\" (UniqueName: \"kubernetes.io/projected/8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f-kube-api-access-vh447\") pod \"s3-tls-init-serving-fdt9x\" (UID: \"8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f\") " pod="kserve/s3-tls-init-serving-fdt9x" Apr 22 19:37:21.690964 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.690941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vh447\" (UniqueName: \"kubernetes.io/projected/8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f-kube-api-access-vh447\") pod \"s3-tls-init-serving-fdt9x\" (UID: \"8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f\") " pod="kserve/s3-tls-init-serving-fdt9x" Apr 22 19:37:21.699892 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.699864 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh447\" (UniqueName: \"kubernetes.io/projected/8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f-kube-api-access-vh447\") pod \"s3-tls-init-serving-fdt9x\" (UID: \"8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f\") " pod="kserve/s3-tls-init-serving-fdt9x" Apr 22 19:37:21.798046 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.798016 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-fdt9x" Apr 22 19:37:21.909549 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:21.909522 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-fdt9x"] Apr 22 19:37:21.912407 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:37:21.912381 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e5715bb_8ec7_4f65_b10e_08a16b7d6a3f.slice/crio-87bf41e907ae24c9cd201fbe263bd2c8e57bb20072a5a2fa0a87f7f672521c40 WatchSource:0}: Error finding container 87bf41e907ae24c9cd201fbe263bd2c8e57bb20072a5a2fa0a87f7f672521c40: Status 404 returned error can't find the container with id 87bf41e907ae24c9cd201fbe263bd2c8e57bb20072a5a2fa0a87f7f672521c40 Apr 22 19:37:22.699870 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:22.699830 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fdt9x" event={"ID":"8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f","Type":"ContainerStarted","Data":"cd2c817c9ce9738baea1ef56beaae821315ee6af05aec933e91eb8c488a298ee"} Apr 22 19:37:22.699870 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:22.699869 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fdt9x" event={"ID":"8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f","Type":"ContainerStarted","Data":"87bf41e907ae24c9cd201fbe263bd2c8e57bb20072a5a2fa0a87f7f672521c40"} Apr 22 19:37:22.716891 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:22.716838 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-fdt9x" podStartSLOduration=1.716820343 podStartE2EDuration="1.716820343s" podCreationTimestamp="2026-04-22 19:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:37:22.714925113 +0000 UTC m=+857.124339626" watchObservedRunningTime="2026-04-22 19:37:22.716820343 +0000 UTC m=+857.126234857" Apr 22 19:37:27.718581 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:27.718550 2572 generic.go:358] "Generic (PLEG): container finished" podID="8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f" containerID="cd2c817c9ce9738baea1ef56beaae821315ee6af05aec933e91eb8c488a298ee" exitCode=0 Apr 22 19:37:27.718936 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:27.718623 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fdt9x" event={"ID":"8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f","Type":"ContainerDied","Data":"cd2c817c9ce9738baea1ef56beaae821315ee6af05aec933e91eb8c488a298ee"} Apr 22 19:37:28.853225 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:28.853205 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-fdt9x" Apr 22 19:37:28.943732 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:28.943711 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh447\" (UniqueName: \"kubernetes.io/projected/8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f-kube-api-access-vh447\") pod \"8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f\" (UID: \"8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f\") " Apr 22 19:37:28.945725 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:28.945698 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f-kube-api-access-vh447" (OuterVolumeSpecName: "kube-api-access-vh447") pod "8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f" (UID: "8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f"). InnerVolumeSpecName "kube-api-access-vh447". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:37:29.044456 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:29.044398 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vh447\" (UniqueName: \"kubernetes.io/projected/8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f-kube-api-access-vh447\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:37:29.725320 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:29.725284 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fdt9x" event={"ID":"8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f","Type":"ContainerDied","Data":"87bf41e907ae24c9cd201fbe263bd2c8e57bb20072a5a2fa0a87f7f672521c40"} Apr 22 19:37:29.725320 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:29.725284 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-fdt9x" Apr 22 19:37:29.725320 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:29.725320 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87bf41e907ae24c9cd201fbe263bd2c8e57bb20072a5a2fa0a87f7f672521c40" Apr 22 19:37:38.846148 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:38.846114 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc"] Apr 22 19:37:38.846519 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:38.846391 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f" containerName="s3-tls-init-serving" Apr 22 19:37:38.846519 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:38.846401 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f" containerName="s3-tls-init-serving" Apr 22 19:37:38.846519 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:38.846454 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f" containerName="s3-tls-init-serving" Apr 22 19:37:38.849542 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:38.849524 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:37:38.852347 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:38.852326 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5f4v\"" Apr 22 19:37:38.861046 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:38.861025 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc"] Apr 22 19:37:39.012285 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:39.012255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1819e413-32fa-4b67-afe5-ead6cb7993ba-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-86bf666f98-bz2gc\" (UID: \"1819e413-32fa-4b67-afe5-ead6cb7993ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:37:39.113289 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:39.113220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1819e413-32fa-4b67-afe5-ead6cb7993ba-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-86bf666f98-bz2gc\" (UID: \"1819e413-32fa-4b67-afe5-ead6cb7993ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:37:39.113630 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:39.113612 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1819e413-32fa-4b67-afe5-ead6cb7993ba-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-86bf666f98-bz2gc\" (UID: \"1819e413-32fa-4b67-afe5-ead6cb7993ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:37:39.158817 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:39.158796 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:37:39.275694 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:39.275648 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc"] Apr 22 19:37:39.279498 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:37:39.279467 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1819e413_32fa_4b67_afe5_ead6cb7993ba.slice/crio-13265e75887a4bceafa44401404bb37c3ccca467acca0bcdad30bf7bdf6297f8 WatchSource:0}: Error finding container 13265e75887a4bceafa44401404bb37c3ccca467acca0bcdad30bf7bdf6297f8: Status 404 returned error can't find the container with id 13265e75887a4bceafa44401404bb37c3ccca467acca0bcdad30bf7bdf6297f8 Apr 22 19:37:39.757088 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:39.757022 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" event={"ID":"1819e413-32fa-4b67-afe5-ead6cb7993ba","Type":"ContainerStarted","Data":"13265e75887a4bceafa44401404bb37c3ccca467acca0bcdad30bf7bdf6297f8"} Apr 22 19:37:42.766657 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:42.766624 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" event={"ID":"1819e413-32fa-4b67-afe5-ead6cb7993ba","Type":"ContainerStarted","Data":"359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f"} Apr 22 19:37:45.778615 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:45.778576 2572 generic.go:358] "Generic (PLEG): container finished" podID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerID="359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f" exitCode=0 Apr 22 19:37:45.778952 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:45.778636 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" event={"ID":"1819e413-32fa-4b67-afe5-ead6cb7993ba","Type":"ContainerDied","Data":"359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f"} Apr 22 19:37:59.826159 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:37:59.826116 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" event={"ID":"1819e413-32fa-4b67-afe5-ead6cb7993ba","Type":"ContainerStarted","Data":"1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14"} Apr 22 19:38:02.836341 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:02.836306 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" event={"ID":"1819e413-32fa-4b67-afe5-ead6cb7993ba","Type":"ContainerStarted","Data":"71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff"} Apr 22 19:38:02.836745 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:02.836538 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:38:02.837709 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:02.837643 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection 
refused" Apr 22 19:38:02.855969 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:02.855931 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podStartSLOduration=2.222837917 podStartE2EDuration="24.85591952s" podCreationTimestamp="2026-04-22 19:37:38 +0000 UTC" firstStartedPulling="2026-04-22 19:37:39.281500494 +0000 UTC m=+873.690914990" lastFinishedPulling="2026-04-22 19:38:01.914582088 +0000 UTC m=+896.323996593" observedRunningTime="2026-04-22 19:38:02.854167625 +0000 UTC m=+897.263582149" watchObservedRunningTime="2026-04-22 19:38:02.85591952 +0000 UTC m=+897.265334033" Apr 22 19:38:03.839778 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:03.839747 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:38:03.840168 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:03.839842 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 19:38:03.840862 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:03.840830 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:38:04.842569 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:04.842530 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 19:38:04.843007 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:04.842879 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:38:06.096568 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:06.096542 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:38:06.098187 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:06.098165 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:38:14.843066 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:14.843027 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 19:38:14.843526 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:14.843503 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:38:24.842919 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:24.842878 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 19:38:24.843342 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:24.843322 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:38:34.842886 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:34.842838 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 19:38:34.843339 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:34.843313 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:38:44.843395 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:44.843350 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 19:38:44.843871 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:44.843848 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:38:54.842565 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:54.842511 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 19:38:54.843052 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:38:54.843025 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:04.843856 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:04.843822 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:39:04.844356 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:04.843992 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:39:13.852564 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:13.852535 2572 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc"] Apr 22 19:39:13.853082 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:13.852927 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" containerID="cri-o://1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14" gracePeriod=30 Apr 22 19:39:13.853142 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:13.853059 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" containerID="cri-o://71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff" gracePeriod=30 Apr 22 19:39:13.945250 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:13.945227 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf"] Apr 22 19:39:13.948635 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:13.948617 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:39:13.960392 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:13.960369 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf"] Apr 22 19:39:14.081438 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:14.081413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c36e6b14-dfde-4fc2-8bbb-49906587b4b9-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf\" (UID: \"c36e6b14-dfde-4fc2-8bbb-49906587b4b9\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:39:14.182162 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:14.182140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c36e6b14-dfde-4fc2-8bbb-49906587b4b9-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf\" (UID: \"c36e6b14-dfde-4fc2-8bbb-49906587b4b9\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:39:14.182483 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:14.182464 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c36e6b14-dfde-4fc2-8bbb-49906587b4b9-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf\" (UID: \"c36e6b14-dfde-4fc2-8bbb-49906587b4b9\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:39:14.258873 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:14.258851 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:39:14.376033 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:14.375978 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf"] Apr 22 19:39:14.378361 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:39:14.378329 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc36e6b14_dfde_4fc2_8bbb_49906587b4b9.slice/crio-cd0e4c98fc834d99be955a697895feaebf497c4517b521f938a5d0deb284d2e7 WatchSource:0}: Error finding container cd0e4c98fc834d99be955a697895feaebf497c4517b521f938a5d0deb284d2e7: Status 404 returned error can't find the container with id cd0e4c98fc834d99be955a697895feaebf497c4517b521f938a5d0deb284d2e7 Apr 22 19:39:14.842565 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:14.842520 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 19:39:14.842927 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:14.842897 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:15.053704 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:15.053650 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" event={"ID":"c36e6b14-dfde-4fc2-8bbb-49906587b4b9","Type":"ContainerStarted","Data":"12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884"} Apr 22 19:39:15.053704 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:15.053704 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" event={"ID":"c36e6b14-dfde-4fc2-8bbb-49906587b4b9","Type":"ContainerStarted","Data":"cd0e4c98fc834d99be955a697895feaebf497c4517b521f938a5d0deb284d2e7"} Apr 22 19:39:18.062462 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:18.062432 2572 generic.go:358] "Generic (PLEG): container finished" podID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerID="1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14" exitCode=0 Apr 22 19:39:18.062789 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:18.062510 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" event={"ID":"1819e413-32fa-4b67-afe5-ead6cb7993ba","Type":"ContainerDied","Data":"1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14"} Apr 22 19:39:18.063787 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:18.063770 2572 generic.go:358] "Generic (PLEG): container finished" podID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerID="12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884" exitCode=0 Apr 22 19:39:18.063880 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:18.063832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" 
event={"ID":"c36e6b14-dfde-4fc2-8bbb-49906587b4b9","Type":"ContainerDied","Data":"12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884"} Apr 22 19:39:19.068705 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:19.068653 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" event={"ID":"c36e6b14-dfde-4fc2-8bbb-49906587b4b9","Type":"ContainerStarted","Data":"55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6"} Apr 22 19:39:19.069070 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:19.068716 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" event={"ID":"c36e6b14-dfde-4fc2-8bbb-49906587b4b9","Type":"ContainerStarted","Data":"71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853"} Apr 22 19:39:19.069070 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:19.069008 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:39:19.070293 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:19.070270 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:5000: connect: connection refused" Apr 22 19:39:19.086921 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:19.086873 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podStartSLOduration=6.086861687 podStartE2EDuration="6.086861687s" podCreationTimestamp="2026-04-22 19:39:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:39:19.086148648 +0000 UTC m=+973.495563186" watchObservedRunningTime="2026-04-22 19:39:19.086861687 +0000 UTC m=+973.496276199" Apr 22 19:39:20.072217 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:20.072186 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:39:20.072657 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:20.072311 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:5000: connect: connection refused" Apr 22 19:39:20.073199 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:20.073176 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:21.074632 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:21.074598 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:5000: connect: connection refused" Apr 22 19:39:21.075120 ip-10-0-134-231 
kubenswrapper[2572]: I0422 19:39:21.074993 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:24.843221 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:24.843177 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 19:39:24.843735 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:24.843489 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:31.074775 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:31.074644 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:5000: connect: connection refused" Apr 22 19:39:31.075226 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:31.075186 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:34.843189 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:34.843141 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 19:39:34.843710 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:34.843316 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:39:34.843710 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:34.843513 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:34.843710 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:34.843647 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:39:41.074957 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:41.074905 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:5000: connect: connection refused" Apr 22 19:39:41.075432 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:41.075407 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:43.985491 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:43.985462 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:39:44.084361 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.084323 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1819e413-32fa-4b67-afe5-ead6cb7993ba-kserve-provision-location\") pod \"1819e413-32fa-4b67-afe5-ead6cb7993ba\" (UID: \"1819e413-32fa-4b67-afe5-ead6cb7993ba\") " Apr 22 19:39:44.084656 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.084630 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1819e413-32fa-4b67-afe5-ead6cb7993ba-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1819e413-32fa-4b67-afe5-ead6cb7993ba" (UID: "1819e413-32fa-4b67-afe5-ead6cb7993ba"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:44.144687 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.144613 2572 generic.go:358] "Generic (PLEG): container finished" podID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerID="71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff" exitCode=0 Apr 22 19:39:44.144785 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.144705 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" Apr 22 19:39:44.144785 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.144700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" event={"ID":"1819e413-32fa-4b67-afe5-ead6cb7993ba","Type":"ContainerDied","Data":"71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff"} Apr 22 19:39:44.144862 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.144807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc" event={"ID":"1819e413-32fa-4b67-afe5-ead6cb7993ba","Type":"ContainerDied","Data":"13265e75887a4bceafa44401404bb37c3ccca467acca0bcdad30bf7bdf6297f8"} Apr 22 19:39:44.144862 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.144826 2572 scope.go:117] "RemoveContainer" containerID="71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff" Apr 22 19:39:44.152789 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.152773 2572 scope.go:117] "RemoveContainer" containerID="1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14" Apr 22 19:39:44.159551 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.159535 2572 scope.go:117] "RemoveContainer" containerID="359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f" Apr 22 19:39:44.165837 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.165815 2572 scope.go:117] "RemoveContainer" containerID="71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff" Apr 22 19:39:44.168839 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:39:44.168807 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff\": container with ID starting with 71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff not found: ID does not exist" containerID="71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff" Apr 22 19:39:44.168941 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.168848 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff"} err="failed to get container status \"71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff\": rpc error: code = NotFound desc = could not find container \"71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff\": container with ID starting with 71be0ba6de593ec0442320cbccfca3699fffa63f2a4f683738268a912ab0c4ff not found: ID does not exist" Apr 22 19:39:44.168941 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.168891 2572 scope.go:117] "RemoveContainer" containerID="1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14" Apr 22 19:39:44.169828 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:39:44.169804 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14\": container with ID starting with 1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14 not found: ID does not exist" containerID="1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14" Apr 22 19:39:44.169929 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.169836 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14"} err="failed to get container status \"1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14\": rpc error: code = NotFound desc = could not find container \"1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14\": container with ID starting with 1a806e188521a99d60b8b89bcf63cb78ff6e885cd61cbc978544f1d943717e14 not found: ID does not exist" Apr 22 19:39:44.169929 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.169859 2572 scope.go:117] "RemoveContainer" containerID="359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f" Apr 22 19:39:44.170496 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:39:44.170458 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f\": container with ID starting with 359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f not found: ID does not exist" containerID="359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f" Apr 22 19:39:44.170870 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.170495 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f"} err="failed to get container status \"359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f\": rpc error: code = NotFound desc = could not find container \"359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f\": container with ID starting with 359aa1f2dfda6f775efa269eba944f01d8110bafa71506950e76ef9db17ac17f not found: ID does not exist" Apr 22 19:39:44.171950 ip-10-0-134-231 kubenswrapper[2572]: 
I0422 19:39:44.171931 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc"] Apr 22 19:39:44.172239 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.172222 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-86bf666f98-bz2gc"] Apr 22 19:39:44.184918 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:44.184903 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1819e413-32fa-4b67-afe5-ead6cb7993ba-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:39:46.169783 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:46.169754 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" path="/var/lib/kubelet/pods/1819e413-32fa-4b67-afe5-ead6cb7993ba/volumes" Apr 22 19:39:51.075469 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:51.075425 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:5000: connect: connection refused" Apr 22 19:39:51.075947 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:39:51.075922 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:01.074656 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:01.074608 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:5000: connect: connection refused" Apr 22 19:40:01.075092 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:01.074965 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:11.074769 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:11.074723 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:5000: connect: connection refused" Apr 22 19:40:11.075194 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:11.075171 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:21.075897 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:21.075850 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:40:21.076455 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:21.076270 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:40:29.029050 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:29.029022 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf"] Apr 22 19:40:29.029588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:29.029412 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" containerID="cri-o://71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853" gracePeriod=30 Apr 22 19:40:29.029588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:29.029516 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent" containerID="cri-o://55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6" gracePeriod=30 Apr 22 19:40:31.075212 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:31.075158 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:5000: connect: connection refused" Apr 22 19:40:31.077076 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:31.077048 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:33.282846 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:33.282814 2572 generic.go:358] "Generic (PLEG): container finished" podID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerID="71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853" exitCode=0 Apr 22 19:40:33.283205 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:33.282889 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" event={"ID":"c36e6b14-dfde-4fc2-8bbb-49906587b4b9","Type":"ContainerDied","Data":"71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853"} Apr 22 19:40:39.109334 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.109302 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"] Apr 22 19:40:39.109719 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.109600 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" Apr 22 19:40:39.109719 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.109611 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" Apr 22 19:40:39.109719 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.109621 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="storage-initializer" Apr 22 19:40:39.109719 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.109628 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="storage-initializer" Apr 22 19:40:39.109719 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.109640 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" Apr 22 19:40:39.109719 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.109645 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" Apr 22 19:40:39.109908 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.109739 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="kserve-container" Apr 22 19:40:39.109908 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.109750 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1819e413-32fa-4b67-afe5-ead6cb7993ba" containerName="agent" Apr 22 19:40:39.112852 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.112836 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" Apr 22 19:40:39.121053 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.121030 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"] Apr 22 19:40:39.253264 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.253240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/697adafa-f59c-4e34-99df-bca39fbedd1b-kserve-provision-location\") pod \"isvc-logger-predictor-55d575567c-zl5mt\" (UID: \"697adafa-f59c-4e34-99df-bca39fbedd1b\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" Apr 22 19:40:39.354559 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.354532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/697adafa-f59c-4e34-99df-bca39fbedd1b-kserve-provision-location\") pod \"isvc-logger-predictor-55d575567c-zl5mt\" (UID: \"697adafa-f59c-4e34-99df-bca39fbedd1b\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" Apr 22 19:40:39.354903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.354886 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/697adafa-f59c-4e34-99df-bca39fbedd1b-kserve-provision-location\") pod \"isvc-logger-predictor-55d575567c-zl5mt\" (UID: \"697adafa-f59c-4e34-99df-bca39fbedd1b\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" Apr 22 19:40:39.423078 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.423061 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" Apr 22 19:40:39.537822 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:39.537799 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"] Apr 22 19:40:39.539848 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:40:39.539810 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod697adafa_f59c_4e34_99df_bca39fbedd1b.slice/crio-8be1435a62f9c74952f77a4909b1e9514eb206debcd113d665982ae6c71d42f9 WatchSource:0}: Error finding container 8be1435a62f9c74952f77a4909b1e9514eb206debcd113d665982ae6c71d42f9: Status 404 returned error can't find the container with id 8be1435a62f9c74952f77a4909b1e9514eb206debcd113d665982ae6c71d42f9 Apr 22 19:40:40.305652 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:40.305615 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" event={"ID":"697adafa-f59c-4e34-99df-bca39fbedd1b","Type":"ContainerStarted","Data":"72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27"} Apr 22 19:40:40.306056 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:40.305654 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" event={"ID":"697adafa-f59c-4e34-99df-bca39fbedd1b","Type":"ContainerStarted","Data":"8be1435a62f9c74952f77a4909b1e9514eb206debcd113d665982ae6c71d42f9"} Apr 22 19:40:41.075173 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:41.075127 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:5000: connect: connection refused" Apr 22 19:40:41.076739 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:41.076710 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:44.318947 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:44.318912 2572 generic.go:358] "Generic (PLEG): container finished" podID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerID="72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27" exitCode=0 Apr 22 19:40:44.319302 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:44.318986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" event={"ID":"697adafa-f59c-4e34-99df-bca39fbedd1b","Type":"ContainerDied","Data":"72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27"} Apr 22 19:40:45.323732 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:45.323700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" event={"ID":"697adafa-f59c-4e34-99df-bca39fbedd1b","Type":"ContainerStarted","Data":"6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678"} Apr 22 19:40:45.323732 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:45.323736 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" 
event={"ID":"697adafa-f59c-4e34-99df-bca39fbedd1b","Type":"ContainerStarted","Data":"86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2"} Apr 22 19:40:45.324197 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:45.324003 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" Apr 22 19:40:45.324197 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:45.324025 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" Apr 22 19:40:45.325429 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:45.325388 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 22 19:40:45.326046 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:45.326019 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:45.340518 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:45.340433 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podStartSLOduration=6.340415958 podStartE2EDuration="6.340415958s" podCreationTimestamp="2026-04-22 19:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:40:45.339324979 +0000 UTC m=+1059.748739503" watchObservedRunningTime="2026-04-22 19:40:45.340415958 +0000 UTC m=+1059.749830472" Apr 22 19:40:46.326950 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:46.326916 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 22 19:40:46.327395 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:46.327171 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:51.075228 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:51.075189 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:5000: connect: connection refused" Apr 22 19:40:51.075593 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:51.075300 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:40:51.076848 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:51.076816 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:51.077004 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:51.076913 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:40:56.327697 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:56.327573 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 22 19:40:56.328098 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:56.328071 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:59.159057 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.159036 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:40:59.286622 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.286557 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c36e6b14-dfde-4fc2-8bbb-49906587b4b9-kserve-provision-location\") pod \"c36e6b14-dfde-4fc2-8bbb-49906587b4b9\" (UID: \"c36e6b14-dfde-4fc2-8bbb-49906587b4b9\") " Apr 22 19:40:59.286905 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.286880 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c36e6b14-dfde-4fc2-8bbb-49906587b4b9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c36e6b14-dfde-4fc2-8bbb-49906587b4b9" (UID: "c36e6b14-dfde-4fc2-8bbb-49906587b4b9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:40:59.363595 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.363567 2572 generic.go:358] "Generic (PLEG): container finished" podID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerID="55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6" exitCode=0 Apr 22 19:40:59.363699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.363632 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" event={"ID":"c36e6b14-dfde-4fc2-8bbb-49906587b4b9","Type":"ContainerDied","Data":"55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6"} Apr 22 19:40:59.363699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.363657 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" event={"ID":"c36e6b14-dfde-4fc2-8bbb-49906587b4b9","Type":"ContainerDied","Data":"cd0e4c98fc834d99be955a697895feaebf497c4517b521f938a5d0deb284d2e7"} Apr 22 19:40:59.363699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.363636 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf" Apr 22 19:40:59.363699 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.363686 2572 scope.go:117] "RemoveContainer" containerID="55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6" Apr 22 19:40:59.371441 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.371422 2572 scope.go:117] "RemoveContainer" containerID="71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853" Apr 22 19:40:59.383390 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.383373 2572 scope.go:117] "RemoveContainer" containerID="12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884" Apr 22 19:40:59.386628 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.386602 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf"] Apr 22 19:40:59.387639 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.387620 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c36e6b14-dfde-4fc2-8bbb-49906587b4b9-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:40:59.389880 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.389859 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-677f896499-vlfqf"] Apr 22 19:40:59.390564 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.390550 2572 scope.go:117] "RemoveContainer" containerID="55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6" Apr 22 19:40:59.390850 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:40:59.390830 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6\": container with ID starting with 55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6 not found: ID does not exist" containerID="55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6" Apr 22 19:40:59.390942 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.390853 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6"} err="failed to get container status \"55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6\": rpc error: code = NotFound desc = could not find container \"55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6\": container with ID starting with 55cfaa0d5e093e2dd2539e190c66405d5f87b8ac4fe353934799267be4cd09f6 not found: ID does not exist" Apr 22 19:40:59.390942 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.390869 2572 scope.go:117] "RemoveContainer" containerID="71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853" Apr 22 19:40:59.391117 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:40:59.391101 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853\": container with ID starting with 71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853 not found: ID does not exist" containerID="71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853" Apr 22 19:40:59.391153 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.391123 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853"} err="failed to get container status \"71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853\": rpc error: code = NotFound desc = could not find container \"71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853\": container with ID starting with 71897f5014c8d45096a652f0dd45faa022806e6e44990a833320b733d8b7c853 not found: ID does not exist"
Apr 22 19:40:59.391153 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.391139 2572 scope.go:117] "RemoveContainer" containerID="12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884"
Apr 22 19:40:59.391358 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:40:59.391340 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884\": container with ID starting with 12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884 not found: ID does not exist" containerID="12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884"
Apr 22 19:40:59.391411 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:40:59.391362 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884"} err="failed to get container status \"12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884\": rpc error: code = NotFound desc = could not find container \"12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884\": container with ID starting with 12e230e831e6dc3f4f37fcb2e7061f54580060d4df1dac00bb40a682fa1e2884 not found: ID does not exist"
Apr 22 19:41:00.170907 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:00.170877 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" path="/var/lib/kubelet/pods/c36e6b14-dfde-4fc2-8bbb-49906587b4b9/volumes"
Apr 22 19:41:06.326999 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:06.326961 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 22 19:41:06.327484 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:06.327457 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:41:16.326915 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:16.326869 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 22 19:41:16.327362 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:16.327285 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:41:26.326970 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:26.326927 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 22 19:41:26.327495 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:26.327470 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:41:36.327399 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:36.327353 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 22 19:41:36.327885 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:36.327758 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:41:46.327795 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:46.327767 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"
Apr 22 19:41:46.328168 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:46.327821 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"
Apr 22 19:41:54.361314 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.361282 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"]
Apr 22 19:41:54.361774 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.361578 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container" containerID="cri-o://86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2" gracePeriod=30
Apr 22 19:41:54.361774 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.361631 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent" containerID="cri-o://6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678" gracePeriod=30
Apr 22 19:41:54.435277 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.435253 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"]
Apr 22 19:41:54.435526 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.435515 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container"
Apr 22 19:41:54.435526 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.435527 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container"
Apr 22 19:41:54.435612 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.435544 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="storage-initializer"
Apr 22 19:41:54.435612 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.435549 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="storage-initializer"
Apr 22 19:41:54.435612 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.435556 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent"
Apr 22 19:41:54.435612 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.435561 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent"
Apr 22 19:41:54.435612 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.435606 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="agent"
Apr 22 19:41:54.435803 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.435615 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c36e6b14-dfde-4fc2-8bbb-49906587b4b9" containerName="kserve-container"
Apr 22 19:41:54.438557 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.438542 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"
Apr 22 19:41:54.461272 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.461252 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"]
Apr 22 19:41:54.558986 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.558958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ba6c0ad-5e45-49d9-a11e-62373a8cd29d-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-45gm4\" (UID: \"2ba6c0ad-5e45-49d9-a11e-62373a8cd29d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"
Apr 22 19:41:54.660286 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.660256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ba6c0ad-5e45-49d9-a11e-62373a8cd29d-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-45gm4\" (UID: \"2ba6c0ad-5e45-49d9-a11e-62373a8cd29d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"
Apr 22 19:41:54.660569 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.660555 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ba6c0ad-5e45-49d9-a11e-62373a8cd29d-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-45gm4\" (UID: \"2ba6c0ad-5e45-49d9-a11e-62373a8cd29d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"
Apr 22 19:41:54.748003 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.747977 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"
Apr 22 19:41:54.860952 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.860914 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"]
Apr 22 19:41:54.863837 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:41:54.863812 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ba6c0ad_5e45_49d9_a11e_62373a8cd29d.slice/crio-838d65f1b979f1694c192a63102c0ed691b7d8fdb28f544bfdefe97dd78e2047 WatchSource:0}: Error finding container 838d65f1b979f1694c192a63102c0ed691b7d8fdb28f544bfdefe97dd78e2047: Status 404 returned error can't find the container with id 838d65f1b979f1694c192a63102c0ed691b7d8fdb28f544bfdefe97dd78e2047
Apr 22 19:41:54.868049 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:54.868033 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:41:55.514478 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:55.514445 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" event={"ID":"2ba6c0ad-5e45-49d9-a11e-62373a8cd29d","Type":"ContainerStarted","Data":"bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8"}
Apr 22 19:41:55.514478 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:55.514482 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" event={"ID":"2ba6c0ad-5e45-49d9-a11e-62373a8cd29d","Type":"ContainerStarted","Data":"838d65f1b979f1694c192a63102c0ed691b7d8fdb28f544bfdefe97dd78e2047"}
Apr 22 19:41:56.327599 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:56.327554 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 22 19:41:56.327937 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:56.327914 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:41:58.524948 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:58.524918 2572 generic.go:358] "Generic (PLEG): container finished" podID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerID="86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2" exitCode=0
Apr 22 19:41:58.525267 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:58.524960 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" event={"ID":"697adafa-f59c-4e34-99df-bca39fbedd1b","Type":"ContainerDied","Data":"86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2"}
Apr 22 19:41:59.529201 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:59.529163 2572 generic.go:358] "Generic (PLEG): container finished" podID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerID="bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8" exitCode=0
Apr 22 19:41:59.529692 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:41:59.529239 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" event={"ID":"2ba6c0ad-5e45-49d9-a11e-62373a8cd29d","Type":"ContainerDied","Data":"bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8"}
Apr 22 19:42:06.327521 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:06.327442 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 22 19:42:06.327925 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:06.327777 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:42:06.552219 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:06.552189 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" event={"ID":"2ba6c0ad-5e45-49d9-a11e-62373a8cd29d","Type":"ContainerStarted","Data":"a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57"}
Apr 22 19:42:06.552490 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:06.552473 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"
Apr 22 19:42:06.553786 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:06.553764 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 22 19:42:06.570071 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:06.570032 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" podStartSLOduration=6.14673296 podStartE2EDuration="12.570020746s" podCreationTimestamp="2026-04-22 19:41:54 +0000 UTC" firstStartedPulling="2026-04-22 19:41:59.530579392 +0000 UTC m=+1133.939993884" lastFinishedPulling="2026-04-22 19:42:05.953867176 +0000 UTC m=+1140.363281670" observedRunningTime="2026-04-22 19:42:06.56873632 +0000 UTC m=+1140.978150854" watchObservedRunningTime="2026-04-22 19:42:06.570020746 +0000 UTC m=+1140.979435258"
Apr 22 19:42:07.555112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:07.555074 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 22 19:42:16.327394 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:16.327350 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 22 19:42:16.327846 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:16.327501 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"
Apr 22 19:42:16.327846 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:16.327753 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:42:16.327925 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:16.327860 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"
Apr 22 19:42:17.555754 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:17.555709 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 22 19:42:24.496127 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.496107 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"
Apr 22 19:42:24.572963 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.572940 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/697adafa-f59c-4e34-99df-bca39fbedd1b-kserve-provision-location\") pod \"697adafa-f59c-4e34-99df-bca39fbedd1b\" (UID: \"697adafa-f59c-4e34-99df-bca39fbedd1b\") "
Apr 22 19:42:24.573260 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.573238 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/697adafa-f59c-4e34-99df-bca39fbedd1b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "697adafa-f59c-4e34-99df-bca39fbedd1b" (UID: "697adafa-f59c-4e34-99df-bca39fbedd1b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:42:24.610783 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.610752 2572 generic.go:358] "Generic (PLEG): container finished" podID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerID="6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678" exitCode=137
Apr 22 19:42:24.610898 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.610824 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" event={"ID":"697adafa-f59c-4e34-99df-bca39fbedd1b","Type":"ContainerDied","Data":"6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678"}
Apr 22 19:42:24.610898 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.610853 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt" event={"ID":"697adafa-f59c-4e34-99df-bca39fbedd1b","Type":"ContainerDied","Data":"8be1435a62f9c74952f77a4909b1e9514eb206debcd113d665982ae6c71d42f9"}
Apr 22 19:42:24.610898 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.610853 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"
Apr 22 19:42:24.610898 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.610866 2572 scope.go:117] "RemoveContainer" containerID="6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678"
Apr 22 19:42:24.619372 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.619356 2572 scope.go:117] "RemoveContainer" containerID="86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2"
Apr 22 19:42:24.625782 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.625767 2572 scope.go:117] "RemoveContainer" containerID="72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27"
Apr 22 19:42:24.633266 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.633251 2572 scope.go:117] "RemoveContainer" containerID="6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678"
Apr 22 19:42:24.633577 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:42:24.633560 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678\": container with ID starting with 6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678 not found: ID does not exist" containerID="6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678"
Apr 22 19:42:24.633635 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.633582 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678"} err="failed to get container status \"6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678\": rpc error: code = NotFound desc = could not find container \"6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678\": container with ID starting with 6a3159e572f342515a1ff0b9b29c241ecadf991bb56649d7d4dce1543fee2678 not found: ID does not exist"
Apr 22 19:42:24.633635 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.633598 2572 scope.go:117] "RemoveContainer" containerID="86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2"
Apr 22 19:42:24.633867 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:42:24.633851 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2\": container with ID starting with 86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2 not found: ID does not exist" containerID="86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2"
Apr 22 19:42:24.633915 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.633870 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2"} err="failed to get container status \"86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2\": rpc error: code = NotFound desc = could not find container \"86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2\": container with ID starting with 86630a8fe981e5dce1be361578fd52abfde27fb3a398ce06172cb44f51d169f2 not found: ID does not exist"
Apr 22 19:42:24.633915 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.633883 2572 scope.go:117] "RemoveContainer" containerID="72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27"
Apr 22 19:42:24.634080 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.634064 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"]
Apr 22 19:42:24.634121 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:42:24.634065 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27\": container with ID starting with 72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27 not found: ID does not exist" containerID="72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27"
Apr 22 19:42:24.634121 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.634101 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27"} err="failed to get container status \"72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27\": rpc error: code = NotFound desc = could not find container \"72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27\": container with ID starting with 72d3e4dd2718ed9fd9b45a77c5e6fc917c9e1baf36a1d8fc2ca00920490fcc27 not found: ID does not exist"
Apr 22 19:42:24.637090 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.637070 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-55d575567c-zl5mt"]
Apr 22 19:42:24.673388 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:24.673370 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/697adafa-f59c-4e34-99df-bca39fbedd1b-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 19:42:26.169739 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:26.169649 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" path="/var/lib/kubelet/pods/697adafa-f59c-4e34-99df-bca39fbedd1b/volumes"
Apr 22 19:42:27.556103 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:27.556062 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 22 19:42:37.555848 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:37.555807 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 22 19:42:47.555681 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:47.555621 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 22 19:42:57.555656 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:42:57.555613 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 22 19:43:06.118870 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:06.118840 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log"
Apr 22 19:43:06.121115 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:06.121095 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log"
Apr 22 19:43:07.556020 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:07.555975 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 22 19:43:17.556853 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:17.556820 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"
Apr 22 19:43:24.466516 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.466486 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"]
Apr 22 19:43:24.466923 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.466784 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container" containerID="cri-o://a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57" gracePeriod=30
Apr 22 19:43:24.572905 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.572878 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"]
Apr 22 19:43:24.573171 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.573160 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container"
Apr 22 19:43:24.573213 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.573172 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container"
Apr 22 19:43:24.573213 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.573181 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent"
Apr 22 19:43:24.573213 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.573187 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent"
Apr 22 19:43:24.573213 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.573198 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="storage-initializer"
Apr 22 19:43:24.573213 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.573203 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="storage-initializer"
Apr 22 19:43:24.573357 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.573244 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="agent"
Apr 22 19:43:24.573357 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.573253 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="697adafa-f59c-4e34-99df-bca39fbedd1b" containerName="kserve-container"
Apr 22 19:43:24.577330 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.577311 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"
Apr 22 19:43:24.584770 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.584746 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"]
Apr 22 19:43:24.678841 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.678816 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e6db887-3cc0-435a-9c32-41d8f00f3ad8-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv\" (UID: \"3e6db887-3cc0-435a-9c32-41d8f00f3ad8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"
Apr 22 19:43:24.779149 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.779085 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e6db887-3cc0-435a-9c32-41d8f00f3ad8-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv\" (UID: \"3e6db887-3cc0-435a-9c32-41d8f00f3ad8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"
Apr 22 19:43:24.779401 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.779385 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e6db887-3cc0-435a-9c32-41d8f00f3ad8-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv\" (UID: \"3e6db887-3cc0-435a-9c32-41d8f00f3ad8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"
Apr 22 19:43:24.888253 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:24.888232 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"
Apr 22 19:43:25.001216 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:25.001195 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"]
Apr 22 19:43:25.003710 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:43:25.003685 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e6db887_3cc0_435a_9c32_41d8f00f3ad8.slice/crio-f812999a8928e21605d82307251663ba6e28d87cc95e0ed9d2c1b9374e88fec8 WatchSource:0}: Error finding container f812999a8928e21605d82307251663ba6e28d87cc95e0ed9d2c1b9374e88fec8: Status 404 returned error can't find the container with id f812999a8928e21605d82307251663ba6e28d87cc95e0ed9d2c1b9374e88fec8
Apr 22 19:43:25.784651 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:25.784611 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" event={"ID":"3e6db887-3cc0-435a-9c32-41d8f00f3ad8","Type":"ContainerStarted","Data":"94d0fdfeaf2bd6290b8780b84661831db10c889eaa22bee1a4ba06b54c57bb99"}
Apr 22 19:43:25.784651 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:25.784650 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" event={"ID":"3e6db887-3cc0-435a-9c32-41d8f00f3ad8","Type":"ContainerStarted","Data":"f812999a8928e21605d82307251663ba6e28d87cc95e0ed9d2c1b9374e88fec8"}
Apr 22 19:43:27.555556 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:27.555516 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 22 19:43:28.302817 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.302797 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"
Apr 22 19:43:28.405182 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.405155 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ba6c0ad-5e45-49d9-a11e-62373a8cd29d-kserve-provision-location\") pod \"2ba6c0ad-5e45-49d9-a11e-62373a8cd29d\" (UID: \"2ba6c0ad-5e45-49d9-a11e-62373a8cd29d\") "
Apr 22 19:43:28.405555 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.405532 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ba6c0ad-5e45-49d9-a11e-62373a8cd29d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" (UID: "2ba6c0ad-5e45-49d9-a11e-62373a8cd29d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:43:28.506352 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.506329 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ba6c0ad-5e45-49d9-a11e-62373a8cd29d-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 19:43:28.793375 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.793343 2572 generic.go:358] "Generic (PLEG): container finished" podID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerID="a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57" exitCode=0
Apr 22 19:43:28.793784 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.793436 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"
Apr 22 19:43:28.793784 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.793439 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" event={"ID":"2ba6c0ad-5e45-49d9-a11e-62373a8cd29d","Type":"ContainerDied","Data":"a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57"}
Apr 22 19:43:28.793784 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.793479 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4" event={"ID":"2ba6c0ad-5e45-49d9-a11e-62373a8cd29d","Type":"ContainerDied","Data":"838d65f1b979f1694c192a63102c0ed691b7d8fdb28f544bfdefe97dd78e2047"}
Apr 22 19:43:28.793784 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.793495 2572 scope.go:117] "RemoveContainer" containerID="a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57"
Apr 22 19:43:28.794968 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.794941 2572 generic.go:358] "Generic (PLEG): container finished" podID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerID="94d0fdfeaf2bd6290b8780b84661831db10c889eaa22bee1a4ba06b54c57bb99" exitCode=0
Apr 22 19:43:28.795081 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.794983 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" event={"ID":"3e6db887-3cc0-435a-9c32-41d8f00f3ad8","Type":"ContainerDied","Data":"94d0fdfeaf2bd6290b8780b84661831db10c889eaa22bee1a4ba06b54c57bb99"}
Apr 22 19:43:28.801515 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.801475 2572 scope.go:117] "RemoveContainer" containerID="bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8"
Apr 22 19:43:28.808282 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.808263 2572 scope.go:117] "RemoveContainer" containerID="a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57"
Apr 22 19:43:28.808491 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:43:28.808474 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57\": container with ID starting with a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57 not found: ID does not exist" containerID="a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57"
Apr 22 19:43:28.808562 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.808497 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57"} err="failed to get container status \"a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57\": rpc error: code = NotFound desc = could not find container \"a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57\": container with ID starting with a84b0a33d0cae0cc80183a996105fd5bea226e552360fd20a94ac3b990955d57 not found: ID does not exist"
Apr 22 19:43:28.808562 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.808512 2572 scope.go:117] "RemoveContainer" containerID="bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8"
Apr 22 19:43:28.808771 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:43:28.808753 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8\": container with ID starting with bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8 not found: ID does not exist" containerID="bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8"
Apr 22 19:43:28.808840 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.808776 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8"} err="failed to get container status \"bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8\": rpc error: code = NotFound desc = could not find container \"bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8\": container with ID starting with bba19fed439f8b2deaa0e930f1dd4a5b6ba87aa88748048a892e39744ee843b8 not found: ID does not exist"
Apr 22 19:43:28.824048 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.824028 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"]
Apr 22 19:43:28.827384 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:28.827365 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-45gm4"]
Apr 22 19:43:29.799923 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:29.799881 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" event={"ID":"3e6db887-3cc0-435a-9c32-41d8f00f3ad8","Type":"ContainerStarted","Data":"57f9ebdda0b17c549cda4eb7259aec4fefa8af0e2b13e7c1d158179925250283"}
Apr 22 19:43:29.800386 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:29.800233 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"
Apr 22 19:43:29.801476 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:29.801449 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 22 19:43:29.821024 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:29.820984 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" podStartSLOduration=5.820969939 podStartE2EDuration="5.820969939s" podCreationTimestamp="2026-04-22 19:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:43:29.819430552 +0000 UTC m=+1224.228845064" watchObservedRunningTime="2026-04-22 19:43:29.820969939 +0000 UTC m=+1224.230384453"
Apr 22 19:43:30.169727 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:30.169696 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" path="/var/lib/kubelet/pods/2ba6c0ad-5e45-49d9-a11e-62373a8cd29d/volumes"
Apr 22 19:43:30.803075 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:30.803035 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 22 19:43:40.803447 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:40.803406 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 22 19:43:50.803781 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:43:50.803738 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 22 19:44:00.803931 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:00.803887 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 22 19:44:10.803202 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:10.803160 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 22 19:44:20.803444 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:20.803400 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 22 19:44:30.803843 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:30.803806 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 22 19:44:40.804559 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:40.804529 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"
Apr 22 19:44:45.211312 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.211278 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"]
Apr 22 19:44:45.211652 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.211551 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="storage-initializer"
Apr 22 19:44:45.211652 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.211561 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="storage-initializer"
Apr 22 19:44:45.211652 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.211570 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container"
Apr 22 19:44:45.211652 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.211576 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container"
Apr 22 19:44:45.211652 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.211624 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ba6c0ad-5e45-49d9-a11e-62373a8cd29d" containerName="kserve-container"
Apr 22 19:44:45.214611 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.214596 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"
Apr 22 19:44:45.239863 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.239842 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"]
Apr 22 19:44:45.307221 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.307195 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"]
Apr 22 19:44:45.307438 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.307419 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" containerID="cri-o://57f9ebdda0b17c549cda4eb7259aec4fefa8af0e2b13e7c1d158179925250283" gracePeriod=30
Apr 22 19:44:45.325207 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.325182 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/263a1473-4b82-4490-ac28-264def880027-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt\" (UID: \"263a1473-4b82-4490-ac28-264def880027\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"
Apr 22 19:44:45.426086 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.426061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/263a1473-4b82-4490-ac28-264def880027-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt\" (UID: \"263a1473-4b82-4490-ac28-264def880027\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"
Apr 22 19:44:45.426394 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.426378 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/263a1473-4b82-4490-ac28-264def880027-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt\" (UID: \"263a1473-4b82-4490-ac28-264def880027\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"
Apr 22 19:44:45.524050 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.523998 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"
Apr 22 19:44:45.637567 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:45.637523 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"]
Apr 22 19:44:45.641092 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:44:45.641066 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod263a1473_4b82_4490_ac28_264def880027.slice/crio-c4d3cfde2b0a2f8793bcee2baf2f0ee2b5fc7c21ba91e35429ebf7e223013a07 WatchSource:0}: Error finding container c4d3cfde2b0a2f8793bcee2baf2f0ee2b5fc7c21ba91e35429ebf7e223013a07: Status 404 returned error can't find the container with id c4d3cfde2b0a2f8793bcee2baf2f0ee2b5fc7c21ba91e35429ebf7e223013a07
Apr 22 19:44:46.014032 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:46.014000 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt" event={"ID":"263a1473-4b82-4490-ac28-264def880027","Type":"ContainerStarted","Data":"ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd"}
Apr 22 19:44:46.014032 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:46.014032 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt" event={"ID":"263a1473-4b82-4490-ac28-264def880027","Type":"ContainerStarted","Data":"c4d3cfde2b0a2f8793bcee2baf2f0ee2b5fc7c21ba91e35429ebf7e223013a07"}
Apr 22 19:44:49.024779 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:49.024747 2572 generic.go:358] "Generic (PLEG): container finished" podID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerID="57f9ebdda0b17c549cda4eb7259aec4fefa8af0e2b13e7c1d158179925250283" exitCode=0
Apr 22 19:44:49.025097 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:49.024800 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" event={"ID":"3e6db887-3cc0-435a-9c32-41d8f00f3ad8","Type":"ContainerDied","Data":"57f9ebdda0b17c549cda4eb7259aec4fefa8af0e2b13e7c1d158179925250283"}
Apr 22 19:44:49.041752 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:49.041734 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"
Apr 22 19:44:49.153576 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:49.153556 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e6db887-3cc0-435a-9c32-41d8f00f3ad8-kserve-provision-location\") pod \"3e6db887-3cc0-435a-9c32-41d8f00f3ad8\" (UID: \"3e6db887-3cc0-435a-9c32-41d8f00f3ad8\") "
Apr 22 19:44:49.153885 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:49.153864 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e6db887-3cc0-435a-9c32-41d8f00f3ad8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3e6db887-3cc0-435a-9c32-41d8f00f3ad8" (UID: "3e6db887-3cc0-435a-9c32-41d8f00f3ad8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:44:49.255027 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:49.255006 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e6db887-3cc0-435a-9c32-41d8f00f3ad8-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 19:44:50.029773 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:50.029728 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv" event={"ID":"3e6db887-3cc0-435a-9c32-41d8f00f3ad8","Type":"ContainerDied","Data":"f812999a8928e21605d82307251663ba6e28d87cc95e0ed9d2c1b9374e88fec8"}
Apr 22 19:44:50.029773 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:50.029768 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"
Apr 22 19:44:50.030280 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:50.029793 2572 scope.go:117] "RemoveContainer" containerID="57f9ebdda0b17c549cda4eb7259aec4fefa8af0e2b13e7c1d158179925250283"
Apr 22 19:44:50.031574 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:50.031545 2572 generic.go:358] "Generic (PLEG): container finished" podID="263a1473-4b82-4490-ac28-264def880027" containerID="ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd" exitCode=0
Apr 22 19:44:50.031718 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:50.031622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt" event={"ID":"263a1473-4b82-4490-ac28-264def880027","Type":"ContainerDied","Data":"ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd"}
Apr 22 19:44:50.039248 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:50.038784 2572 scope.go:117] "RemoveContainer" containerID="94d0fdfeaf2bd6290b8780b84661831db10c889eaa22bee1a4ba06b54c57bb99"
Apr 22 19:44:50.064045 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:50.064025 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"]
Apr 22 19:44:50.065570 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:50.065545 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-s6gcv"]
Apr 22 19:44:50.169344 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:44:50.169304 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" path="/var/lib/kubelet/pods/3e6db887-3cc0-435a-9c32-41d8f00f3ad8/volumes"
Apr 22 19:47:12.486952 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:12.486912 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt" event={"ID":"263a1473-4b82-4490-ac28-264def880027","Type":"ContainerStarted","Data":"dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c"}
Apr 22 19:47:12.487387 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:12.487047 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"
Apr 22 19:47:12.516060 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:12.516014 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt" podStartSLOduration=5.992644507 podStartE2EDuration="2m27.516002327s" podCreationTimestamp="2026-04-22 19:44:45 +0000 UTC" firstStartedPulling="2026-04-22 19:44:50.032922369 +0000 UTC m=+1304.442336873" lastFinishedPulling="2026-04-22 19:47:11.556280198 +0000 UTC m=+1445.965694693" observedRunningTime="2026-04-22 19:47:12.513847521 +0000 UTC m=+1446.923262034" watchObservedRunningTime="2026-04-22 19:47:12.516002327 +0000 UTC m=+1446.925416840"
Apr 22 19:47:43.494367 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:43.494332 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"
Apr 22 19:47:45.619762 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.619731 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"]
19:47:45.620142 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.619959 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt" podUID="263a1473-4b82-4490-ac28-264def880027" containerName="kserve-container" containerID="cri-o://dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c" gracePeriod=30 Apr 22 19:47:45.638136 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.638104 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww"] Apr 22 19:47:45.638447 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.638431 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="storage-initializer" Apr 22 19:47:45.638524 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.638450 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="storage-initializer" Apr 22 19:47:45.638524 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.638475 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" Apr 22 19:47:45.638524 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.638486 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" Apr 22 19:47:45.638691 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.638575 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e6db887-3cc0-435a-9c32-41d8f00f3ad8" containerName="kserve-container" Apr 22 19:47:45.666070 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.666037 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww"] Apr 22 19:47:45.666212 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.666183 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" Apr 22 19:47:45.814723 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.814692 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0ed4b0d-aa77-46f4-99a3-ede0a54102a6-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww\" (UID: \"e0ed4b0d-aa77-46f4-99a3-ede0a54102a6\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" Apr 22 19:47:45.915835 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.915805 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0ed4b0d-aa77-46f4-99a3-ede0a54102a6-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww\" (UID: \"e0ed4b0d-aa77-46f4-99a3-ede0a54102a6\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" Apr 22 19:47:45.916193 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.916174 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0ed4b0d-aa77-46f4-99a3-ede0a54102a6-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww\" (UID: \"e0ed4b0d-aa77-46f4-99a3-ede0a54102a6\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" Apr 22 19:47:45.977321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:45.977297 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" Apr 22 19:47:46.100819 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.100791 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww"] Apr 22 19:47:46.103436 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:47:46.103404 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-76256efec981ac4ea2475832f237bb8147227ea0b7d6c0f8dd248ff16a43167f WatchSource:0}: Error finding container 76256efec981ac4ea2475832f237bb8147227ea0b7d6c0f8dd248ff16a43167f: Status 404 returned error can't find the container with id 76256efec981ac4ea2475832f237bb8147227ea0b7d6c0f8dd248ff16a43167f Apr 22 19:47:46.105276 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.105260 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:47:46.569710 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.569689 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt" Apr 22 19:47:46.588288 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.588263 2572 generic.go:358] "Generic (PLEG): container finished" podID="263a1473-4b82-4490-ac28-264def880027" containerID="dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c" exitCode=0 Apr 22 19:47:46.588399 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.588329 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt" Apr 22 19:47:46.588465 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.588334 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt" event={"ID":"263a1473-4b82-4490-ac28-264def880027","Type":"ContainerDied","Data":"dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c"} Apr 22 19:47:46.588465 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.588429 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt" event={"ID":"263a1473-4b82-4490-ac28-264def880027","Type":"ContainerDied","Data":"c4d3cfde2b0a2f8793bcee2baf2f0ee2b5fc7c21ba91e35429ebf7e223013a07"} Apr 22 19:47:46.588465 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.588448 2572 scope.go:117] "RemoveContainer" containerID="dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c" Apr 22 19:47:46.589741 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.589718 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" event={"ID":"e0ed4b0d-aa77-46f4-99a3-ede0a54102a6","Type":"ContainerStarted","Data":"32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646"} Apr 22 19:47:46.589825 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.589751 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" event={"ID":"e0ed4b0d-aa77-46f4-99a3-ede0a54102a6","Type":"ContainerStarted","Data":"76256efec981ac4ea2475832f237bb8147227ea0b7d6c0f8dd248ff16a43167f"} Apr 22 19:47:46.595812 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.595783 2572 scope.go:117] "RemoveContainer" containerID="ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd" Apr 22 19:47:46.603118 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.603098 2572 scope.go:117] "RemoveContainer" containerID="dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c" Apr 22 19:47:46.603360 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:47:46.603333 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c\": container with ID starting with dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c not found: ID does not exist" containerID="dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c" Apr 22 19:47:46.603433 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.603369 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c"} err="failed to get container status \"dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c\": rpc error: code = NotFound desc = could not find container \"dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c\": container with ID starting with dccb440c05c13dbd6e33fc2d4c392b9245a81a20fd2cd89ba4ee6c4c02ab447c not found: ID does not exist" Apr 22 19:47:46.603433 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.603391 2572 scope.go:117] "RemoveContainer" containerID="ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd" Apr 22 19:47:46.603610 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:47:46.603591 2572 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd\": container with ID starting with ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd not found: ID does not exist" containerID="ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd" Apr 22 19:47:46.603650 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.603615 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd"} err="failed to get container status \"ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd\": rpc error: code = NotFound desc = could not find container \"ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd\": container with ID starting with ab27c5db9b752b0067d3bb9b2cde12000ffda736a338eb34531f478ac1c6f9bd not found: ID does not exist" Apr 22 19:47:46.721649 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.721581 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/263a1473-4b82-4490-ac28-264def880027-kserve-provision-location\") pod \"263a1473-4b82-4490-ac28-264def880027\" (UID: \"263a1473-4b82-4490-ac28-264def880027\") " Apr 22 19:47:46.722059 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.721967 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/263a1473-4b82-4490-ac28-264def880027-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "263a1473-4b82-4490-ac28-264def880027" (UID: "263a1473-4b82-4490-ac28-264def880027"). InnerVolumeSpecName "kserve-provision-location". 
Apr 22 19:47:46.823034 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.823006 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/263a1473-4b82-4490-ac28-264def880027-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 19:47:46.911549 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.911525 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"]
Apr 22 19:47:46.916447 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:46.916426 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-j8ngt"]
Apr 22 19:47:48.169460 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:48.169416 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="263a1473-4b82-4490-ac28-264def880027" path="/var/lib/kubelet/pods/263a1473-4b82-4490-ac28-264def880027/volumes"
Apr 22 19:47:50.603370 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:50.603339 2572 generic.go:358] "Generic (PLEG): container finished" podID="e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" containerID="32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646" exitCode=0
Apr 22 19:47:50.603755 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:50.603388 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" event={"ID":"e0ed4b0d-aa77-46f4-99a3-ede0a54102a6","Type":"ContainerDied","Data":"32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646"}
Apr 22 19:47:51.607679 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:51.607637 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" event={"ID":"e0ed4b0d-aa77-46f4-99a3-ede0a54102a6","Type":"ContainerStarted","Data":"b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c"}
Apr 22 19:47:51.608085 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:51.608009 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww"
Apr 22 19:47:51.609287 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:51.609262 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" podUID="e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 22 19:47:51.628157 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:51.628107 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" podStartSLOduration=6.6280901740000004 podStartE2EDuration="6.628090174s" podCreationTimestamp="2026-04-22 19:47:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:47:51.626717677 +0000 UTC m=+1486.036132190" watchObservedRunningTime="2026-04-22 19:47:51.628090174 +0000 UTC m=+1486.037504689"
Apr 22 19:47:52.610681 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:47:52.610634 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" podUID="e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 22 19:48:02.611854 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:02.611817 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww"
Apr 22 19:48:05.619979 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.619949 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww"]
Apr 22 19:48:05.620370 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.620192 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" podUID="e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" containerName="kserve-container" containerID="cri-o://b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c" gracePeriod=30
Apr 22 19:48:05.673490 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.673457 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"]
Apr 22 19:48:05.673827 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.673810 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="263a1473-4b82-4490-ac28-264def880027" containerName="kserve-container"
Apr 22 19:48:05.673912 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.673829 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="263a1473-4b82-4490-ac28-264def880027" containerName="kserve-container"
Apr 22 19:48:05.673912 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.673860 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="263a1473-4b82-4490-ac28-264def880027" containerName="storage-initializer"
Apr 22 19:48:05.673912 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.673868 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="263a1473-4b82-4490-ac28-264def880027" containerName="storage-initializer"
Apr 22 19:48:05.674062 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.673948 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="263a1473-4b82-4490-ac28-264def880027" containerName="kserve-container"
Apr 22 19:48:05.676853 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.676834 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"
Apr 22 19:48:05.687052 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.687030 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"]
Apr 22 19:48:05.752768 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.752747 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6bae01c-a196-4c94-a404-baa58f9655ec-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj\" (UID: \"c6bae01c-a196-4c94-a404-baa58f9655ec\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"
Apr 22 19:48:05.853322 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.853292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6bae01c-a196-4c94-a404-baa58f9655ec-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj\" (UID: \"c6bae01c-a196-4c94-a404-baa58f9655ec\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"
Apr 22 19:48:05.853692 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.853656 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6bae01c-a196-4c94-a404-baa58f9655ec-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj\" (UID: \"c6bae01c-a196-4c94-a404-baa58f9655ec\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"
Apr 22 19:48:05.986558 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:05.986511 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"
Apr 22 19:48:06.117437 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.117401 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"]
Apr 22 19:48:06.129585 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:48:06.129559 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6bae01c_a196_4c94_a404_baa58f9655ec.slice/crio-b5ba63c7dd26a0d5952a1928896419c1a1dd3006c4a88fba1f65c8b517c89c2d WatchSource:0}: Error finding container b5ba63c7dd26a0d5952a1928896419c1a1dd3006c4a88fba1f65c8b517c89c2d: Status 404 returned error can't find the container with id b5ba63c7dd26a0d5952a1928896419c1a1dd3006c4a88fba1f65c8b517c89c2d
Apr 22 19:48:06.141502 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.141480 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log"
Apr 22 19:48:06.144293 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.144271 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log"
Apr 22 19:48:06.257747 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.257728 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww"
Apr 22 19:48:06.357197 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.357171 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0ed4b0d-aa77-46f4-99a3-ede0a54102a6-kserve-provision-location\") pod \"e0ed4b0d-aa77-46f4-99a3-ede0a54102a6\" (UID: \"e0ed4b0d-aa77-46f4-99a3-ede0a54102a6\") "
Apr 22 19:48:06.357484 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.357463 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ed4b0d-aa77-46f4-99a3-ede0a54102a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" (UID: "e0ed4b0d-aa77-46f4-99a3-ede0a54102a6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:48:06.457639 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.457613 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0ed4b0d-aa77-46f4-99a3-ede0a54102a6-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 19:48:06.650601 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.650569 2572 generic.go:358] "Generic (PLEG): container finished" podID="e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" containerID="b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c" exitCode=0
Apr 22 19:48:06.651081 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.650652 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww"
Apr 22 19:48:06.651081 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.650649 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" event={"ID":"e0ed4b0d-aa77-46f4-99a3-ede0a54102a6","Type":"ContainerDied","Data":"b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c"}
Apr 22 19:48:06.651081 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.650788 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww" event={"ID":"e0ed4b0d-aa77-46f4-99a3-ede0a54102a6","Type":"ContainerDied","Data":"76256efec981ac4ea2475832f237bb8147227ea0b7d6c0f8dd248ff16a43167f"}
Apr 22 19:48:06.651081 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.650820 2572 scope.go:117] "RemoveContainer" containerID="b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c"
Apr 22 19:48:06.652347 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.652312 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj" event={"ID":"c6bae01c-a196-4c94-a404-baa58f9655ec","Type":"ContainerStarted","Data":"e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200"}
Apr 22 19:48:06.652347 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.652341 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj" event={"ID":"c6bae01c-a196-4c94-a404-baa58f9655ec","Type":"ContainerStarted","Data":"b5ba63c7dd26a0d5952a1928896419c1a1dd3006c4a88fba1f65c8b517c89c2d"}
Apr 22 19:48:06.660143 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.660120 2572 scope.go:117] "RemoveContainer" containerID="32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646"
Apr 22 19:48:06.671332 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.671310 2572 scope.go:117] "RemoveContainer" containerID="b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c"
Apr 22 19:48:06.671569 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:06.671548 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c\": container with ID starting with b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c not found: ID does not exist" containerID="b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c"
Apr 22 19:48:06.671655 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.671574 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c"} err="failed to get container status \"b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c\": rpc error: code = NotFound desc = could not find container \"b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c\": container with ID starting with b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c not found: ID does not exist"
Apr 22 19:48:06.671655 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.671591 2572 scope.go:117] "RemoveContainer" containerID="32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646"
Apr 22 19:48:06.671857 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:06.671838 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646\": container with ID starting with 32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646 not found: ID does not exist" containerID="32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646"
Apr 22 19:48:06.671914 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.671861 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646"} err="failed to get container status \"32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646\": rpc error: code = NotFound desc = could not find container \"32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646\": container with ID starting with 32fe70a5faa633b09df3d1c7ef2c019f4bc2bfabd3a46556d876aaf22cfef646 not found: ID does not exist"
Apr 22 19:48:06.689498 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.689450 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww"]
Apr 22 19:48:06.690857 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:06.690836 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-xvfww"]
Apr 22 19:48:07.093289 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:07.093260 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 19:48:07.093413 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:07.093267 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 19:48:08.170163 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:08.170131 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" path="/var/lib/kubelet/pods/e0ed4b0d-aa77-46f4-99a3-ede0a54102a6/volumes"
Apr 22 19:48:09.986300 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:09.986274 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 19:48:10.667648 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:10.667617 2572 generic.go:358] "Generic (PLEG): container finished" podID="c6bae01c-a196-4c94-a404-baa58f9655ec" containerID="e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200" exitCode=0
Apr 22 19:48:10.667839 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:10.667697 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj" event={"ID":"c6bae01c-a196-4c94-a404-baa58f9655ec","Type":"ContainerDied","Data":"e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200"}
Apr 22 19:48:11.673477 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:11.673444 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj" event={"ID":"c6bae01c-a196-4c94-a404-baa58f9655ec","Type":"ContainerStarted","Data":"33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19"}
Apr 22 19:48:11.673840 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:11.673657 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"
Apr 22 19:48:11.693399 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:11.693355 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj" podStartSLOduration=6.693342131 podStartE2EDuration="6.693342131s" podCreationTimestamp="2026-04-22 19:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:48:11.691797959 +0000 UTC m=+1506.101212483" watchObservedRunningTime="2026-04-22 19:48:11.693342131 +0000 UTC m=+1506.102756644"
Apr 22 19:48:20.021751 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:20.021717 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 19:48:20.456267 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:20.456235 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 19:48:30.055985 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:30.055911 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 19:48:35.499797 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:35.497852 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 19:48:40.063908 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:40.063870 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 19:48:42.682342 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:42.682304 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"
Apr 22 19:48:45.853276 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:45.853241 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"]
Apr 22 19:48:45.853718 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:45.853530 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj" podUID="c6bae01c-a196-4c94-a404-baa58f9655ec" containerName="kserve-container" containerID="cri-o://33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19" gracePeriod=30
Apr 22 19:48:45.865813 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:45.865787 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"]
Apr 22 19:48:45.866216 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:45.866201 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" containerName="kserve-container"
Apr 22 19:48:45.866266 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:45.866220 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" containerName="kserve-container"
Apr 22 19:48:45.866266 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:45.866252 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" containerName="storage-initializer"
Apr 22 19:48:45.866266 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:45.866261 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" containerName="storage-initializer"
Apr 22 19:48:45.866373 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:45.866357 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0ed4b0d-aa77-46f4-99a3-ede0a54102a6" containerName="kserve-container"
Apr 22 19:48:45.870045 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:45.870023 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"
Apr 22 19:48:45.883744 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:45.883707 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"]
Apr 22 19:48:45.924637 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:45.924601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6b99f5466f-5w48v\" (UID: \"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"
Apr 22 19:48:46.025626 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:46.025592 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6b99f5466f-5w48v\" (UID: \"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"
Apr 22 19:48:46.025959 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:46.025943 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6b99f5466f-5w48v\" (UID: \"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"
Apr 22 19:48:46.182247 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:46.182211 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" Apr 22 19:48:46.308525 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:46.308500 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"] Apr 22 19:48:46.310646 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:48:46.310618 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0bb04b3_0d9e_43f4_943d_2cfa1f5c77da.slice/crio-d7c99e6d472fb788c864bace487510e9893ac8fef9a3cc14c8cccda8e86db7d7 WatchSource:0}: Error finding container d7c99e6d472fb788c864bace487510e9893ac8fef9a3cc14c8cccda8e86db7d7: Status 404 returned error can't find the container with id d7c99e6d472fb788c864bace487510e9893ac8fef9a3cc14c8cccda8e86db7d7 Apr 22 19:48:46.782119 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:46.782089 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" event={"ID":"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da","Type":"ContainerStarted","Data":"30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55"} Apr 22 19:48:46.782247 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:46.782129 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" event={"ID":"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da","Type":"ContainerStarted","Data":"d7c99e6d472fb788c864bace487510e9893ac8fef9a3cc14c8cccda8e86db7d7"} Apr 22 19:48:46.970414 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:46.970391 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj" Apr 22 19:48:47.034420 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.034364 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6bae01c-a196-4c94-a404-baa58f9655ec-kserve-provision-location\") pod \"c6bae01c-a196-4c94-a404-baa58f9655ec\" (UID: \"c6bae01c-a196-4c94-a404-baa58f9655ec\") " Apr 22 19:48:47.034693 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.034649 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6bae01c-a196-4c94-a404-baa58f9655ec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c6bae01c-a196-4c94-a404-baa58f9655ec" (UID: "c6bae01c-a196-4c94-a404-baa58f9655ec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:47.134869 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.134845 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6bae01c-a196-4c94-a404-baa58f9655ec-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:48:47.786267 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.786235 2572 generic.go:358] "Generic (PLEG): container finished" podID="c6bae01c-a196-4c94-a404-baa58f9655ec" containerID="33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19" exitCode=0 Apr 22 19:48:47.786443 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.786306 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj" Apr 22 19:48:47.786443 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.786325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj" event={"ID":"c6bae01c-a196-4c94-a404-baa58f9655ec","Type":"ContainerDied","Data":"33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19"} Apr 22 19:48:47.786443 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.786366 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj" event={"ID":"c6bae01c-a196-4c94-a404-baa58f9655ec","Type":"ContainerDied","Data":"b5ba63c7dd26a0d5952a1928896419c1a1dd3006c4a88fba1f65c8b517c89c2d"} Apr 22 19:48:47.786443 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.786381 2572 scope.go:117] "RemoveContainer" containerID="33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19" Apr 22 19:48:47.794438 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.794419 2572 scope.go:117] "RemoveContainer" containerID="e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200" Apr 22 19:48:47.801035 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.801018 2572 scope.go:117] "RemoveContainer" containerID="33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19" Apr 22 19:48:47.801249 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:47.801233 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19\": container with ID starting with 33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19 not found: ID does not exist" containerID="33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19" Apr 22 19:48:47.801296 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.801256 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19"} err="failed to get container status \"33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19\": rpc error: code = NotFound desc = could not find container \"33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19\": container with ID starting with 33bb817860dd77d72c6f45dd8d2ebf2e0d55e55d1305cd7a21d980d76d316e19 not found: ID does not exist" Apr 22 19:48:47.801296 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.801271 2572 scope.go:117] "RemoveContainer" containerID="e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200" Apr 22 19:48:47.801449 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:47.801434 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200\": container with ID starting with e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200 not found: ID does not exist" containerID="e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200" Apr 22 19:48:47.801489 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.801453 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200"} err="failed to get container status \"e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200\": rpc 
error: code = NotFound desc = could not find container \"e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200\": container with ID starting with e2e0af30c874eb33152bce4c8a1a87df3f72027b4fe436b63e258db5bd8b6200 not found: ID does not exist" Apr 22 19:48:47.809980 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.809960 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"] Apr 22 19:48:47.811871 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:47.811850 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9rkvj"] Apr 22 19:48:48.169197 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:48.169164 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bae01c-a196-4c94-a404-baa58f9655ec" path="/var/lib/kubelet/pods/c6bae01c-a196-4c94-a404-baa58f9655ec/volumes" Apr 22 19:48:50.098551 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:50.098523 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:48:50.465415 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:48:50.465376 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:48:50.797581 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:50.797504 2572 generic.go:358] "Generic (PLEG): container finished" podID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerID="30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55" exitCode=0 Apr 22 19:48:50.797722 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:50.797577 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" event={"ID":"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da","Type":"ContainerDied","Data":"30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55"} Apr 22 19:48:51.803861 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:51.803825 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" event={"ID":"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da","Type":"ContainerStarted","Data":"1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90"} Apr 22 19:48:54.814758 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:54.814724 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" event={"ID":"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da","Type":"ContainerStarted","Data":"583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf"} Apr 22 19:48:54.815112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:54.814868 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" Apr 22 19:48:54.836002 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:54.835948 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" podStartSLOduration=6.72755801 podStartE2EDuration="9.835935141s" podCreationTimestamp="2026-04-22 19:48:45 +0000 UTC" firstStartedPulling="2026-04-22 19:48:50.857178293 +0000 UTC m=+1545.266592783" lastFinishedPulling="2026-04-22 19:48:53.965555422 +0000 UTC m=+1548.374969914" observedRunningTime="2026-04-22 19:48:54.83439462 +0000 UTC m=+1549.243809136" watchObservedRunningTime="2026-04-22 19:48:54.835935141 +0000 UTC m=+1549.245349653" Apr 22 19:48:55.817952 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:48:55.817923 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" Apr 22 19:49:00.133787 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:49:00.133756 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:49:05.507405 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:49:05.507370 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed4b0d_aa77_46f4_99a3_ede0a54102a6.slice/crio-b479301fb28ffd2d506280ef6a87783c49c3fe665d8e780f93bac1532d7d232c.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:49:26.822469 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:49:26.822436 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" Apr 22 19:49:56.824105 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:49:56.824033 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" Apr 22 19:50:05.937400 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:05.937368 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"] Apr 22 19:50:05.937818 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:05.937651 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="kserve-container" containerID="cri-o://1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90" gracePeriod=30 Apr 22 19:50:05.937818 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:05.937718 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="kserve-agent" containerID="cri-o://583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf" gracePeriod=30 Apr 22 19:50:06.082538 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.082508 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"] Apr 22 19:50:06.082823 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.082810 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6bae01c-a196-4c94-a404-baa58f9655ec" containerName="storage-initializer" Apr 22 19:50:06.082874 ip-10-0-134-231 
Apr 22 19:50:06.082874 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.082838 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6bae01c-a196-4c94-a404-baa58f9655ec" containerName="kserve-container"
Apr 22 19:50:06.082874 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.082844 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bae01c-a196-4c94-a404-baa58f9655ec" containerName="kserve-container"
Apr 22 19:50:06.083002 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.082899 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6bae01c-a196-4c94-a404-baa58f9655ec" containerName="kserve-container"
Apr 22 19:50:06.085934 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.085915 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"
Apr 22 19:50:06.101263 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.101242 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"]
Apr 22 19:50:06.166952 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.166924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/decfdab5-4f4f-4b63-af99-6bdbe3ec1da4-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-f5ldt\" (UID: \"decfdab5-4f4f-4b63-af99-6bdbe3ec1da4\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"
Apr 22 19:50:06.268075 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.268020 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/decfdab5-4f4f-4b63-af99-6bdbe3ec1da4-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-f5ldt\" (UID: \"decfdab5-4f4f-4b63-af99-6bdbe3ec1da4\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"
Apr 22 19:50:06.268326 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.268311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/decfdab5-4f4f-4b63-af99-6bdbe3ec1da4-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-f5ldt\" (UID: \"decfdab5-4f4f-4b63-af99-6bdbe3ec1da4\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"
Apr 22 19:50:06.394993 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.394973 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"
Apr 22 19:50:06.522049 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.519689 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"]
Apr 22 19:50:06.525610 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:50:06.525584 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddecfdab5_4f4f_4b63_af99_6bdbe3ec1da4.slice/crio-071e931f8a36647c4ab333d8f17d66005aebd6dd5c2f19076bd72ecd15ca3ec3 WatchSource:0}: Error finding container 071e931f8a36647c4ab333d8f17d66005aebd6dd5c2f19076bd72ecd15ca3ec3: Status 404 returned error can't find the container with id 071e931f8a36647c4ab333d8f17d66005aebd6dd5c2f19076bd72ecd15ca3ec3
Apr 22 19:50:06.821319 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:06.821237 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.28:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 22 19:50:07.027719 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:07.027656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" event={"ID":"decfdab5-4f4f-4b63-af99-6bdbe3ec1da4","Type":"ContainerStarted","Data":"02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677"}
Apr 22 19:50:07.028115 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:07.027727 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" event={"ID":"decfdab5-4f4f-4b63-af99-6bdbe3ec1da4","Type":"ContainerStarted","Data":"071e931f8a36647c4ab333d8f17d66005aebd6dd5c2f19076bd72ecd15ca3ec3"}
Apr 22 19:50:08.032029 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:08.031970 2572 generic.go:358] "Generic (PLEG): container finished" podID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerID="1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90" exitCode=0
Apr 22 19:50:08.032293 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:08.032038 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" event={"ID":"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da","Type":"ContainerDied","Data":"1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90"}
Apr 22 19:50:12.045397 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:12.045363 2572 generic.go:358] "Generic (PLEG): container finished" podID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerID="02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677" exitCode=0
Apr 22 19:50:12.045791 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:12.045447 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" event={"ID":"decfdab5-4f4f-4b63-af99-6bdbe3ec1da4","Type":"ContainerDied","Data":"02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677"}
Apr 22 19:50:16.821058 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:16.821005 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.28:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 22 19:50:25.087634 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:25.087601 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" event={"ID":"decfdab5-4f4f-4b63-af99-6bdbe3ec1da4","Type":"ContainerStarted","Data":"e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3"}
Apr 22 19:50:25.088021 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:25.087984 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"
Apr 22 19:50:25.088988 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:25.088963 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 22 19:50:25.108066 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:25.108026 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" podStartSLOduration=6.892259279 podStartE2EDuration="19.108015406s" podCreationTimestamp="2026-04-22 19:50:06 +0000 UTC" firstStartedPulling="2026-04-22 19:50:12.046618304 +0000 UTC m=+1626.456032795" lastFinishedPulling="2026-04-22 19:50:24.262374416 +0000 UTC m=+1638.671788922" observedRunningTime="2026-04-22 19:50:25.105968537 +0000 UTC m=+1639.515383049" watchObservedRunningTime="2026-04-22 19:50:25.108015406 +0000 UTC m=+1639.517429918"
Apr 22 19:50:26.090747 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:26.090710 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 22 19:50:26.820623 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:26.820582 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.28:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 22 19:50:26.820808 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:26.820756 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"
Apr 22 19:50:36.073456 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.073431 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"
Apr 22 19:50:36.090849 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.090823 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 22 19:50:36.120824 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.120799 2572 generic.go:358] "Generic (PLEG): container finished" podID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerID="583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf" exitCode=0
Apr 22 19:50:36.120951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.120872 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"
Apr 22 19:50:36.120951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.120879 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" event={"ID":"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da","Type":"ContainerDied","Data":"583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf"}
Apr 22 19:50:36.120951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.120914 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v" event={"ID":"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da","Type":"ContainerDied","Data":"d7c99e6d472fb788c864bace487510e9893ac8fef9a3cc14c8cccda8e86db7d7"}
Apr 22 19:50:36.120951 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.120935 2572 scope.go:117] "RemoveContainer" containerID="583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf"
Apr 22 19:50:36.129861 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.129842 2572 scope.go:117] "RemoveContainer" containerID="1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90"
Apr 22 19:50:36.137717 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.137697 2572 scope.go:117] "RemoveContainer" containerID="30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55"
Apr 22 19:50:36.144042 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.144011 2572 scope.go:117] "RemoveContainer" containerID="583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf"
Apr 22 19:50:36.144285 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:50:36.144265 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf\": container with ID starting with 583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf not found: ID does not exist" containerID="583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf"
Apr 22 19:50:36.144357 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.144294 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf"} err="failed to get container status \"583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf\": rpc error: code = NotFound desc = could not find container \"583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf\": container with ID starting with 583a927c9286cbcd6799cac830ba8754730cd9e8ba1fdac75c0bb48177d3f9bf not found: ID does not exist"
does not exist" Apr 22 19:50:36.144357 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.144312 2572 scope.go:117] "RemoveContainer" containerID="1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90" Apr 22 19:50:36.144550 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:50:36.144533 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90\": container with ID starting with 1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90 not found: ID does not exist" containerID="1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90" Apr 22 19:50:36.144615 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.144553 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90"} err="failed to get container status \"1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90\": rpc error: code = NotFound desc = could not find container \"1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90\": container with ID starting with 1710cf043a139cf23f8240dce4dc4a924cf8b2293579dc59f10b86c3d58b2b90 not found: ID does not exist" Apr 22 19:50:36.144615 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.144567 2572 scope.go:117] "RemoveContainer" containerID="30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55" Apr 22 19:50:36.144801 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:50:36.144775 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55\": container with ID starting with 30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55 not found: ID does not exist" containerID="30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55" Apr 22 19:50:36.144869 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.144804 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55"} err="failed to get container status \"30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55\": rpc error: code = NotFound desc = could not find container \"30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55\": container with ID starting with 30ac0859ed2478e2080f2f154ed3a35e01b7cf136948a4800a7e396abd041a55 not found: ID does not exist" Apr 22 19:50:36.186278 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.186258 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da-kserve-provision-location\") pod \"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da\" (UID: \"d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da\") " Apr 22 19:50:36.186514 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.186492 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" (UID: "d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:50:36.292568 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.288146 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:50:36.451649 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.451620 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"] Apr 22 19:50:36.455831 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:36.455811 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b99f5466f-5w48v"] Apr 22 19:50:38.169532 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:38.169499 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" path="/var/lib/kubelet/pods/d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da/volumes" Apr 22 19:50:46.090914 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:46.090872 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:50:56.090716 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:50:56.090640 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:51:06.092737 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:06.092709 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" Apr 22 19:51:17.507549 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.507515 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"] Apr 22 19:51:17.508127 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.507847 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerName="kserve-container" containerID="cri-o://e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3" gracePeriod=30 Apr 22 19:51:17.581711 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.581681 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t"] Apr 22 19:51:17.581976 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.581965 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="kserve-agent" Apr 22 19:51:17.582021 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.581979 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="kserve-agent" Apr 22 19:51:17.582021 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.581990 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="kserve-container" Apr 22 19:51:17.582021 
ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.581996 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="kserve-container" Apr 22 19:51:17.582021 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.582007 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="storage-initializer" Apr 22 19:51:17.582021 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.582012 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="storage-initializer" Apr 22 19:51:17.582173 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.582059 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="kserve-container" Apr 22 19:51:17.582173 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.582070 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0bb04b3-0d9e-43f4-943d-2cfa1f5c77da" containerName="kserve-agent" Apr 22 19:51:17.584891 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.584876 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" Apr 22 19:51:17.595708 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.595684 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t"] Apr 22 19:51:17.666062 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.666037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12613fdd-0aca-4502-887a-6641c3b0fb23-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-7p84t\" (UID: \"12613fdd-0aca-4502-887a-6641c3b0fb23\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" Apr 22 19:51:17.767213 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.767140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12613fdd-0aca-4502-887a-6641c3b0fb23-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-7p84t\" (UID: \"12613fdd-0aca-4502-887a-6641c3b0fb23\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" Apr 22 19:51:17.767511 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.767492 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12613fdd-0aca-4502-887a-6641c3b0fb23-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-7p84t\" (UID: \"12613fdd-0aca-4502-887a-6641c3b0fb23\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" Apr 22 19:51:17.894616 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:17.894590 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" Apr 22 19:51:18.008146 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:18.008121 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t"] Apr 22 19:51:18.009973 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:51:18.009946 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12613fdd_0aca_4502_887a_6641c3b0fb23.slice/crio-f1bbec6f69c32dc5c4e520937ecd91c9392d2367dcfbdb0375b9ccb0aa98f07a WatchSource:0}: Error finding container f1bbec6f69c32dc5c4e520937ecd91c9392d2367dcfbdb0375b9ccb0aa98f07a: Status 404 returned error can't find the container with id f1bbec6f69c32dc5c4e520937ecd91c9392d2367dcfbdb0375b9ccb0aa98f07a Apr 22 19:51:18.241247 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:18.241214 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" event={"ID":"12613fdd-0aca-4502-887a-6641c3b0fb23","Type":"ContainerStarted","Data":"9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3"} Apr 22 19:51:18.241247 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:18.241252 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" event={"ID":"12613fdd-0aca-4502-887a-6641c3b0fb23","Type":"ContainerStarted","Data":"f1bbec6f69c32dc5c4e520937ecd91c9392d2367dcfbdb0375b9ccb0aa98f07a"} Apr 22 19:51:20.037237 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.037212 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" Apr 22 19:51:20.187639 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.187618 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/decfdab5-4f4f-4b63-af99-6bdbe3ec1da4-kserve-provision-location\") pod \"decfdab5-4f4f-4b63-af99-6bdbe3ec1da4\" (UID: \"decfdab5-4f4f-4b63-af99-6bdbe3ec1da4\") " Apr 22 19:51:20.197108 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.197087 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/decfdab5-4f4f-4b63-af99-6bdbe3ec1da4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" (UID: "decfdab5-4f4f-4b63-af99-6bdbe3ec1da4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:51:20.247739 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.247711 2572 generic.go:358] "Generic (PLEG): container finished" podID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerID="e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3" exitCode=0 Apr 22 19:51:20.247819 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.247774 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" Apr 22 19:51:20.247819 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.247780 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" event={"ID":"decfdab5-4f4f-4b63-af99-6bdbe3ec1da4","Type":"ContainerDied","Data":"e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3"} Apr 22 19:51:20.247819 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.247803 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt" event={"ID":"decfdab5-4f4f-4b63-af99-6bdbe3ec1da4","Type":"ContainerDied","Data":"071e931f8a36647c4ab333d8f17d66005aebd6dd5c2f19076bd72ecd15ca3ec3"} Apr 22 19:51:20.247819 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.247818 2572 scope.go:117] "RemoveContainer" containerID="e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3" Apr 22 19:51:20.255988 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.255966 2572 scope.go:117] "RemoveContainer" containerID="02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677" Apr 22 19:51:20.264696 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.264680 2572 scope.go:117] "RemoveContainer" containerID="e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3" Apr 22 19:51:20.264952 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:51:20.264935 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3\": container with ID starting with e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3 not found: ID does not exist" containerID="e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3" Apr 22 19:51:20.265007 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.264959 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3"} err="failed to get container status \"e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3\": rpc error: code = NotFound desc = could not find container \"e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3\": container with ID starting with e8a4599c920cea491e10c850b7e688126d9ef960e8788854cc375e115b8f9fb3 not found: ID does not exist" Apr 22 19:51:20.265007 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.264975 2572 scope.go:117] "RemoveContainer" containerID="02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677" Apr 22 19:51:20.265186 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:51:20.265168 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677\": container with ID starting with 02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677 not found: ID does not exist" containerID="02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677" Apr 22 19:51:20.265235 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.265188 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677"} err="failed to get container status \"02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677\": rpc error: code = NotFound desc = 
could not find container \"02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677\": container with ID starting with 02ff27918e969def74214beb8141b4483ed0b05704f09131d6bd70e24095c677 not found: ID does not exist" Apr 22 19:51:20.269123 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.269104 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"] Apr 22 19:51:20.273741 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.273724 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-f5ldt"] Apr 22 19:51:20.288782 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:20.288762 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/decfdab5-4f4f-4b63-af99-6bdbe3ec1da4-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:51:22.170458 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:22.170421 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" path="/var/lib/kubelet/pods/decfdab5-4f4f-4b63-af99-6bdbe3ec1da4/volumes" Apr 22 19:51:23.258799 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:23.258761 2572 generic.go:358] "Generic (PLEG): container finished" podID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerID="9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3" exitCode=0 Apr 22 19:51:23.258799 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:23.258801 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" event={"ID":"12613fdd-0aca-4502-887a-6641c3b0fb23","Type":"ContainerDied","Data":"9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3"} Apr 22 19:51:24.262552 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:24.262518 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" event={"ID":"12613fdd-0aca-4502-887a-6641c3b0fb23","Type":"ContainerStarted","Data":"d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504"} Apr 22 19:51:24.263024 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:24.262869 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" Apr 22 19:51:24.264150 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:24.264123 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:51:24.285530 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:24.285491 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" podStartSLOduration=7.285475228 podStartE2EDuration="7.285475228s" podCreationTimestamp="2026-04-22 19:51:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:51:24.284971129 +0000 UTC m=+1698.694385646" watchObservedRunningTime="2026-04-22 19:51:24.285475228 +0000 UTC m=+1698.694889741" Apr 22 19:51:25.265997 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:25.265957 2572 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:51:35.266425 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:35.266386 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:51:45.266868 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:45.266825 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:51:55.266461 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:51:55.266423 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:52:05.267894 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:05.267855 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" Apr 22 19:52:09.104901 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.104866 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t"] Apr 22 19:52:09.105289 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.105098 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerName="kserve-container" containerID="cri-o://d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504" gracePeriod=30 Apr 22 19:52:09.145233 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.145204 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf"] Apr 22 19:52:09.145550 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.145534 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerName="storage-initializer" Apr 22 19:52:09.145623 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.145553 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerName="storage-initializer" Apr 22 19:52:09.145623 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.145576 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerName="kserve-container" Apr 22 19:52:09.145623 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.145586 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerName="kserve-container" Apr 22 19:52:09.145797 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.145690 2572 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="decfdab5-4f4f-4b63-af99-6bdbe3ec1da4" containerName="kserve-container" Apr 22 19:52:09.148569 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.148551 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" Apr 22 19:52:09.157355 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.157335 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf"] Apr 22 19:52:09.221596 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.221572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fed1f20f-c027-41ec-8642-9beb74836c0b-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf\" (UID: \"fed1f20f-c027-41ec-8642-9beb74836c0b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" Apr 22 19:52:09.322058 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.322032 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fed1f20f-c027-41ec-8642-9beb74836c0b-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf\" (UID: \"fed1f20f-c027-41ec-8642-9beb74836c0b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" Apr 22 19:52:09.322371 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.322354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fed1f20f-c027-41ec-8642-9beb74836c0b-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf\" (UID: \"fed1f20f-c027-41ec-8642-9beb74836c0b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" Apr 22 19:52:09.458618 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.458597 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" Apr 22 19:52:09.576321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:09.576299 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf"] Apr 22 19:52:09.578349 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:52:09.578314 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfed1f20f_c027_41ec_8642_9beb74836c0b.slice/crio-8a5ad4270ef67fa65a5f418b1b21d9c8e1d8a9b6fe63df23f09f437e78c2ddcb WatchSource:0}: Error finding container 8a5ad4270ef67fa65a5f418b1b21d9c8e1d8a9b6fe63df23f09f437e78c2ddcb: Status 404 returned error can't find the container with id 8a5ad4270ef67fa65a5f418b1b21d9c8e1d8a9b6fe63df23f09f437e78c2ddcb Apr 22 19:52:10.389493 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:10.389460 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" event={"ID":"fed1f20f-c027-41ec-8642-9beb74836c0b","Type":"ContainerStarted","Data":"1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011"} Apr 22 19:52:10.389493 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:10.389493 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" event={"ID":"fed1f20f-c027-41ec-8642-9beb74836c0b","Type":"ContainerStarted","Data":"8a5ad4270ef67fa65a5f418b1b21d9c8e1d8a9b6fe63df23f09f437e78c2ddcb"} Apr 22 19:52:11.541282 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:11.541262 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" Apr 22 19:52:11.638419 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:11.638352 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12613fdd-0aca-4502-887a-6641c3b0fb23-kserve-provision-location\") pod \"12613fdd-0aca-4502-887a-6641c3b0fb23\" (UID: \"12613fdd-0aca-4502-887a-6641c3b0fb23\") " Apr 22 19:52:11.647980 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:11.647951 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12613fdd-0aca-4502-887a-6641c3b0fb23-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "12613fdd-0aca-4502-887a-6641c3b0fb23" (UID: "12613fdd-0aca-4502-887a-6641c3b0fb23"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:52:11.739749 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:11.739727 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12613fdd-0aca-4502-887a-6641c3b0fb23-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:52:12.397639 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.397601 2572 generic.go:358] "Generic (PLEG): container finished" podID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerID="d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504" exitCode=0 Apr 22 19:52:12.397775 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.397698 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" event={"ID":"12613fdd-0aca-4502-887a-6641c3b0fb23","Type":"ContainerDied","Data":"d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504"} Apr 22 19:52:12.397775 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.397718 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" Apr 22 19:52:12.397775 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.397743 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t" event={"ID":"12613fdd-0aca-4502-887a-6641c3b0fb23","Type":"ContainerDied","Data":"f1bbec6f69c32dc5c4e520937ecd91c9392d2367dcfbdb0375b9ccb0aa98f07a"} Apr 22 19:52:12.397775 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.397762 2572 scope.go:117] "RemoveContainer" containerID="d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504" Apr 22 19:52:12.405825 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.405809 2572 scope.go:117] "RemoveContainer" containerID="9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3" Apr 22 19:52:12.412864 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.412846 2572 scope.go:117] "RemoveContainer" containerID="d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504" Apr 22 19:52:12.413164 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:52:12.413144 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504\": container with ID starting with d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504 not found: ID does not exist" containerID="d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504" Apr 22 19:52:12.413239 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.413170 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504"} err="failed to get container status \"d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504\": rpc error: code = NotFound desc = could not find container \"d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504\": container with ID starting with d82b596c0d96baa348a4ccdda30ffda6728fa034a4d522edb757d94327fd2504 not found: ID does not exist" Apr 22 19:52:12.413239 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.413187 2572 scope.go:117] "RemoveContainer" containerID="9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3" Apr 22 19:52:12.413502 ip-10-0-134-231 
kubenswrapper[2572]: E0422 19:52:12.413481 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3\": container with ID starting with 9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3 not found: ID does not exist" containerID="9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3" Apr 22 19:52:12.413574 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.413510 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3"} err="failed to get container status \"9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3\": rpc error: code = NotFound desc = could not find container \"9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3\": container with ID starting with 9ea2dc808fcb27ce64df49e31b6e90780883b63013e87af0a94f71ef11290fc3 not found: ID does not exist" Apr 22 19:52:12.414401 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.414382 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t"] Apr 22 19:52:12.417630 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:12.417610 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7p84t"] Apr 22 19:52:14.169361 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:14.169330 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" path="/var/lib/kubelet/pods/12613fdd-0aca-4502-887a-6641c3b0fb23/volumes" Apr 22 19:52:14.406738 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:14.406707 2572 generic.go:358] "Generic (PLEG): container finished" podID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerID="1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011" exitCode=0 Apr 22 19:52:14.406866 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:14.406744 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" event={"ID":"fed1f20f-c027-41ec-8642-9beb74836c0b","Type":"ContainerDied","Data":"1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011"} Apr 22 19:52:15.411295 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:15.411263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" event={"ID":"fed1f20f-c027-41ec-8642-9beb74836c0b","Type":"ContainerStarted","Data":"7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6"} Apr 22 19:52:15.411751 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:15.411554 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" Apr 22 19:52:15.412887 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:15.412863 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:52:15.427311 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:15.427265 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" podStartSLOduration=6.427248993 podStartE2EDuration="6.427248993s" podCreationTimestamp="2026-04-22 19:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:52:15.426239504 +0000 UTC m=+1749.835654016" watchObservedRunningTime="2026-04-22 19:52:15.427248993 +0000 UTC m=+1749.836663509" Apr 22 19:52:16.414746 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:16.414713 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:52:26.415489 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:26.415451 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:52:36.415176 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:36.415133 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:52:46.415367 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:46.415328 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:52:56.415788 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:52:56.415711 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" Apr 22 19:53:00.800517 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:00.800488 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf"] Apr 22 19:53:00.800999 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:00.800778 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerName="kserve-container" containerID="cri-o://7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6" gracePeriod=30 Apr 22 19:53:00.848620 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:00.848592 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r"] Apr 22 19:53:00.848902 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:00.848889 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerName="storage-initializer" Apr 22 19:53:00.848952 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:00.848904 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerName="storage-initializer" Apr 22 19:53:00.848952 ip-10-0-134-231 kubenswrapper[2572]: I0422 
19:53:00.848918 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerName="kserve-container" Apr 22 19:53:00.848952 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:00.848924 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerName="kserve-container" Apr 22 19:53:00.849048 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:00.848981 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="12613fdd-0aca-4502-887a-6641c3b0fb23" containerName="kserve-container" Apr 22 19:53:00.852178 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:00.852163 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" Apr 22 19:53:00.861311 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:00.861290 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r"] Apr 22 19:53:00.963139 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:00.963117 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/015800d3-a3f1-4763-bb27-8dd913884403-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-dj89r\" (UID: \"015800d3-a3f1-4763-bb27-8dd913884403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" Apr 22 19:53:01.064454 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:01.064395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/015800d3-a3f1-4763-bb27-8dd913884403-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-dj89r\" (UID: \"015800d3-a3f1-4763-bb27-8dd913884403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" Apr 22 19:53:01.064802 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:01.064785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/015800d3-a3f1-4763-bb27-8dd913884403-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-dj89r\" (UID: \"015800d3-a3f1-4763-bb27-8dd913884403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" Apr 22 19:53:01.162416 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:01.162370 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" Apr 22 19:53:01.281465 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:01.281414 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r"] Apr 22 19:53:01.286258 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:53:01.286229 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod015800d3_a3f1_4763_bb27_8dd913884403.slice/crio-d93b3cfbdd75125a3800a572c252adfa90487470f18c9c5305f80a4ed1d2add0 WatchSource:0}: Error finding container d93b3cfbdd75125a3800a572c252adfa90487470f18c9c5305f80a4ed1d2add0: Status 404 returned error can't find the container with id d93b3cfbdd75125a3800a572c252adfa90487470f18c9c5305f80a4ed1d2add0 Apr 22 19:53:01.288216 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:01.288196 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:53:01.542720 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:01.542658 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" event={"ID":"015800d3-a3f1-4763-bb27-8dd913884403","Type":"ContainerStarted","Data":"2253e984487ca5615c00d53f97735fe637a668b0c0b4655c9984c4ab941876f0"} Apr 22 19:53:01.542720 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:01.542719 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" event={"ID":"015800d3-a3f1-4763-bb27-8dd913884403","Type":"ContainerStarted","Data":"d93b3cfbdd75125a3800a572c252adfa90487470f18c9c5305f80a4ed1d2add0"} Apr 22 19:53:03.239380 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.239359 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" Apr 22 19:53:03.384083 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.384021 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fed1f20f-c027-41ec-8642-9beb74836c0b-kserve-provision-location\") pod \"fed1f20f-c027-41ec-8642-9beb74836c0b\" (UID: \"fed1f20f-c027-41ec-8642-9beb74836c0b\") " Apr 22 19:53:03.392960 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.392933 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fed1f20f-c027-41ec-8642-9beb74836c0b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fed1f20f-c027-41ec-8642-9beb74836c0b" (UID: "fed1f20f-c027-41ec-8642-9beb74836c0b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:53:03.485097 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.485073 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fed1f20f-c027-41ec-8642-9beb74836c0b-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:53:03.550895 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.550871 2572 generic.go:358] "Generic (PLEG): container finished" podID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerID="7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6" exitCode=0 Apr 22 19:53:03.551030 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.550924 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" event={"ID":"fed1f20f-c027-41ec-8642-9beb74836c0b","Type":"ContainerDied","Data":"7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6"} Apr 22 19:53:03.551030 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.550942 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" Apr 22 19:53:03.551030 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.550957 2572 scope.go:117] "RemoveContainer" containerID="7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6" Apr 22 19:53:03.551148 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.550946 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf" event={"ID":"fed1f20f-c027-41ec-8642-9beb74836c0b","Type":"ContainerDied","Data":"8a5ad4270ef67fa65a5f418b1b21d9c8e1d8a9b6fe63df23f09f437e78c2ddcb"} Apr 22 19:53:03.558748 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.558729 2572 scope.go:117] "RemoveContainer" containerID="1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011" Apr 22 19:53:03.565718 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.565696 2572 scope.go:117] "RemoveContainer" containerID="7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6" Apr 22 19:53:03.565955 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:53:03.565934 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6\": container with ID starting with 7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6 not found: ID does not exist" containerID="7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6" Apr 22 19:53:03.566014 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.565964 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6"} err="failed to get container status \"7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6\": rpc error: code = NotFound desc = could not find container \"7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6\": container with ID starting with 7f9198c61f6f8d59bd00d98d06e6eab833c33fc65dd02c3bc94026be252171a6 not found: ID does not exist" Apr 22 19:53:03.566014 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.565983 2572 scope.go:117] "RemoveContainer" containerID="1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011" Apr 22 19:53:03.566188 ip-10-0-134-231 
kubenswrapper[2572]: E0422 19:53:03.566174 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011\": container with ID starting with 1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011 not found: ID does not exist" containerID="1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011" Apr 22 19:53:03.566227 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.566192 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011"} err="failed to get container status \"1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011\": rpc error: code = NotFound desc = could not find container \"1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011\": container with ID starting with 1dc9d8862b649e909a68a1ba3e727246e2061352a7286cfc2c59b54af7e01011 not found: ID does not exist" Apr 22 19:53:03.571772 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.571752 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf"] Apr 22 19:53:03.574356 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:03.574338 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-4fpwf"] Apr 22 19:53:04.169400 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:04.169370 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" path="/var/lib/kubelet/pods/fed1f20f-c027-41ec-8642-9beb74836c0b/volumes" Apr 22 19:53:05.559468 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:05.559434 2572 generic.go:358] "Generic (PLEG): container finished" podID="015800d3-a3f1-4763-bb27-8dd913884403" containerID="2253e984487ca5615c00d53f97735fe637a668b0c0b4655c9984c4ab941876f0" exitCode=0 Apr 22 19:53:05.559859 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:05.559512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" event={"ID":"015800d3-a3f1-4763-bb27-8dd913884403","Type":"ContainerDied","Data":"2253e984487ca5615c00d53f97735fe637a668b0c0b4655c9984c4ab941876f0"} Apr 22 19:53:06.197727 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:06.197701 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:53:06.211311 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:06.211230 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:53:12.590103 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:12.590038 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" event={"ID":"015800d3-a3f1-4763-bb27-8dd913884403","Type":"ContainerStarted","Data":"1106fe82f44ad7f93ccae0fcafb1f2381ff3032b44fbedc9e4b7676d361c7db0"} Apr 22 19:53:12.590418 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:12.590334 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" Apr 22 19:53:12.591642 ip-10-0-134-231 kubenswrapper[2572]: I0422 
19:53:12.591618 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:53:12.606428 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:12.606389 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" podStartSLOduration=5.881958751 podStartE2EDuration="12.606377752s" podCreationTimestamp="2026-04-22 19:53:00 +0000 UTC" firstStartedPulling="2026-04-22 19:53:05.560641124 +0000 UTC m=+1799.970055618" lastFinishedPulling="2026-04-22 19:53:12.285060128 +0000 UTC m=+1806.694474619" observedRunningTime="2026-04-22 19:53:12.604958109 +0000 UTC m=+1807.014372621" watchObservedRunningTime="2026-04-22 19:53:12.606377752 +0000 UTC m=+1807.015792264" Apr 22 19:53:13.593035 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:13.592994 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:53:23.593287 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:23.593244 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:53:33.593193 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:33.593150 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:53:43.593171 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:43.593125 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:53:53.593644 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:53:53.593602 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:54:03.593923 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:03.593877 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:54:13.593813 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:13.593774 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:54:23.594579 
ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:23.594547 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" Apr 22 19:54:31.810800 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:31.810718 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r"] Apr 22 19:54:31.811236 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:31.810983 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" containerID="cri-o://1106fe82f44ad7f93ccae0fcafb1f2381ff3032b44fbedc9e4b7676d361c7db0" gracePeriod=30 Apr 22 19:54:31.900145 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:31.900114 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm"] Apr 22 19:54:31.900456 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:31.900441 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerName="storage-initializer" Apr 22 19:54:31.900534 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:31.900459 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerName="storage-initializer" Apr 22 19:54:31.900534 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:31.900476 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerName="kserve-container" Apr 22 19:54:31.900534 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:31.900484 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerName="kserve-container" Apr 22 19:54:31.900712 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:31.900588 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fed1f20f-c027-41ec-8642-9beb74836c0b" containerName="kserve-container" Apr 22 19:54:31.906906 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:31.906872 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" Apr 22 19:54:31.911406 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:31.911357 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm"] Apr 22 19:54:31.994804 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:31.994774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac5edabb-d258-4679-814d-9355f7ecdd50-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-wjntm\" (UID: \"ac5edabb-d258-4679-814d-9355f7ecdd50\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" Apr 22 19:54:32.095191 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:32.095136 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac5edabb-d258-4679-814d-9355f7ecdd50-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-wjntm\" (UID: \"ac5edabb-d258-4679-814d-9355f7ecdd50\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" Apr 22 19:54:32.095456 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:32.095440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac5edabb-d258-4679-814d-9355f7ecdd50-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-wjntm\" (UID: \"ac5edabb-d258-4679-814d-9355f7ecdd50\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" Apr 22 19:54:32.217949 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:32.217926 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" Apr 22 19:54:32.330153 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:32.330130 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm"] Apr 22 19:54:32.332439 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:54:32.332404 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5edabb_d258_4679_814d_9355f7ecdd50.slice/crio-6bb115313a794eba2afd6100f262c28a5c3b8f13550ea92577b0a498424c23f2 WatchSource:0}: Error finding container 6bb115313a794eba2afd6100f262c28a5c3b8f13550ea92577b0a498424c23f2: Status 404 returned error can't find the container with id 6bb115313a794eba2afd6100f262c28a5c3b8f13550ea92577b0a498424c23f2 Apr 22 19:54:32.813185 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:32.813148 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" event={"ID":"ac5edabb-d258-4679-814d-9355f7ecdd50","Type":"ContainerStarted","Data":"7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b"} Apr 22 19:54:32.813185 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:32.813183 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" event={"ID":"ac5edabb-d258-4679-814d-9355f7ecdd50","Type":"ContainerStarted","Data":"6bb115313a794eba2afd6100f262c28a5c3b8f13550ea92577b0a498424c23f2"} Apr 22 19:54:33.593907 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:33.593865 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:54:34.821975 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:34.821945 2572 generic.go:358] "Generic (PLEG): container finished" podID="015800d3-a3f1-4763-bb27-8dd913884403" containerID="1106fe82f44ad7f93ccae0fcafb1f2381ff3032b44fbedc9e4b7676d361c7db0" exitCode=0 Apr 22 19:54:34.822291 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:34.822015 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" event={"ID":"015800d3-a3f1-4763-bb27-8dd913884403","Type":"ContainerDied","Data":"1106fe82f44ad7f93ccae0fcafb1f2381ff3032b44fbedc9e4b7676d361c7db0"} Apr 22 19:54:34.847567 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:34.847550 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" Apr 22 19:54:34.913547 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:34.913514 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/015800d3-a3f1-4763-bb27-8dd913884403-kserve-provision-location\") pod \"015800d3-a3f1-4763-bb27-8dd913884403\" (UID: \"015800d3-a3f1-4763-bb27-8dd913884403\") " Apr 22 19:54:34.913836 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:34.913816 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015800d3-a3f1-4763-bb27-8dd913884403-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "015800d3-a3f1-4763-bb27-8dd913884403" (UID: "015800d3-a3f1-4763-bb27-8dd913884403"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:54:35.014464 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:35.014442 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/015800d3-a3f1-4763-bb27-8dd913884403-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:54:35.826639 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:35.826608 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" event={"ID":"015800d3-a3f1-4763-bb27-8dd913884403","Type":"ContainerDied","Data":"d93b3cfbdd75125a3800a572c252adfa90487470f18c9c5305f80a4ed1d2add0"} Apr 22 19:54:35.827055 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:35.826615 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r" Apr 22 19:54:35.827055 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:35.826653 2572 scope.go:117] "RemoveContainer" containerID="1106fe82f44ad7f93ccae0fcafb1f2381ff3032b44fbedc9e4b7676d361c7db0" Apr 22 19:54:35.835064 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:35.834941 2572 scope.go:117] "RemoveContainer" containerID="2253e984487ca5615c00d53f97735fe637a668b0c0b4655c9984c4ab941876f0" Apr 22 19:54:35.848172 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:35.848131 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r"] Apr 22 19:54:35.850748 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:35.850727 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-dj89r"] Apr 22 19:54:36.169311 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:36.169277 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015800d3-a3f1-4763-bb27-8dd913884403" path="/var/lib/kubelet/pods/015800d3-a3f1-4763-bb27-8dd913884403/volumes" Apr 22 19:54:36.830941 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:36.830907 2572 generic.go:358] "Generic (PLEG): container finished" podID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerID="7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b" exitCode=0 Apr 22 19:54:36.831406 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:36.830983 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" event={"ID":"ac5edabb-d258-4679-814d-9355f7ecdd50","Type":"ContainerDied","Data":"7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b"} Apr 22 19:54:37.837535 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:37.837497 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" event={"ID":"ac5edabb-d258-4679-814d-9355f7ecdd50","Type":"ContainerStarted","Data":"f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6"} Apr 22 19:54:37.837998 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:37.837905 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" Apr 22 19:54:37.839109 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:37.839081 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:54:37.855694 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:37.855635 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" podStartSLOduration=6.855621296 podStartE2EDuration="6.855621296s" podCreationTimestamp="2026-04-22 19:54:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:54:37.854130951 +0000 UTC m=+1892.263545464" watchObservedRunningTime="2026-04-22 19:54:37.855621296 +0000 UTC m=+1892.265035808" Apr 22 19:54:38.840697 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:38.840641 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:54:48.841412 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:48.841359 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:54:58.841120 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:54:58.841080 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:55:08.841348 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:55:08.841307 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:55:18.840980 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:55:18.840939 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:55:28.841457 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:55:28.841414 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:55:38.841396 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:55:38.841344 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:55:48.840640 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:55:48.840597 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:55:58.169433 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:55:58.169346 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" Apr 22 19:56:02.909377 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:02.909343 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm"] Apr 22 19:56:02.909889 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:02.909655 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" 
podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" containerID="cri-o://f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6" gracePeriod=30 Apr 22 19:56:02.977334 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:02.977308 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8"] Apr 22 19:56:02.977629 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:02.977617 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" Apr 22 19:56:02.977695 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:02.977631 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" Apr 22 19:56:02.977695 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:02.977654 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="storage-initializer" Apr 22 19:56:02.977695 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:02.977675 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="storage-initializer" Apr 22 19:56:02.977793 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:02.977727 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="015800d3-a3f1-4763-bb27-8dd913884403" containerName="kserve-container" Apr 22 19:56:02.980706 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:02.980690 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" Apr 22 19:56:02.989643 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:02.989616 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8"] Apr 22 19:56:03.003378 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:03.003357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa5d47f5-6ba6-4cbb-b240-8c611023d306-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8\" (UID: \"fa5d47f5-6ba6-4cbb-b240-8c611023d306\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" Apr 22 19:56:03.104459 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:03.104438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa5d47f5-6ba6-4cbb-b240-8c611023d306-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8\" (UID: \"fa5d47f5-6ba6-4cbb-b240-8c611023d306\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" Apr 22 19:56:03.104799 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:03.104783 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa5d47f5-6ba6-4cbb-b240-8c611023d306-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8\" (UID: \"fa5d47f5-6ba6-4cbb-b240-8c611023d306\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" Apr 22 19:56:03.291259 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:03.291191 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" Apr 22 19:56:03.403790 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:03.403765 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8"] Apr 22 19:56:03.406144 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:56:03.406116 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa5d47f5_6ba6_4cbb_b240_8c611023d306.slice/crio-09c389df9ac526fdb5c4ff6b2141fa92161719672fdeec702504593582abd503 WatchSource:0}: Error finding container 09c389df9ac526fdb5c4ff6b2141fa92161719672fdeec702504593582abd503: Status 404 returned error can't find the container with id 09c389df9ac526fdb5c4ff6b2141fa92161719672fdeec702504593582abd503 Apr 22 19:56:04.083837 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:04.083804 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" event={"ID":"fa5d47f5-6ba6-4cbb-b240-8c611023d306","Type":"ContainerStarted","Data":"631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01"} Apr 22 19:56:04.083837 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:04.083840 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" event={"ID":"fa5d47f5-6ba6-4cbb-b240-8c611023d306","Type":"ContainerStarted","Data":"09c389df9ac526fdb5c4ff6b2141fa92161719672fdeec702504593582abd503"} Apr 22 19:56:06.742847 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:06.742824 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" Apr 22 19:56:06.829331 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:06.829272 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac5edabb-d258-4679-814d-9355f7ecdd50-kserve-provision-location\") pod \"ac5edabb-d258-4679-814d-9355f7ecdd50\" (UID: \"ac5edabb-d258-4679-814d-9355f7ecdd50\") " Apr 22 19:56:06.829566 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:06.829543 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac5edabb-d258-4679-814d-9355f7ecdd50-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ac5edabb-d258-4679-814d-9355f7ecdd50" (UID: "ac5edabb-d258-4679-814d-9355f7ecdd50"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:56:06.929939 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:06.929916 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac5edabb-d258-4679-814d-9355f7ecdd50-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:56:07.094156 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:07.094074 2572 generic.go:358] "Generic (PLEG): container finished" podID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerID="f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6" exitCode=0 Apr 22 19:56:07.094156 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:07.094144 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" Apr 22 19:56:07.094312 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:07.094144 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" event={"ID":"ac5edabb-d258-4679-814d-9355f7ecdd50","Type":"ContainerDied","Data":"f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6"} Apr 22 19:56:07.094312 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:07.094188 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm" event={"ID":"ac5edabb-d258-4679-814d-9355f7ecdd50","Type":"ContainerDied","Data":"6bb115313a794eba2afd6100f262c28a5c3b8f13550ea92577b0a498424c23f2"} Apr 22 19:56:07.094312 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:07.094219 2572 scope.go:117] "RemoveContainer" containerID="f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6" Apr 22 19:56:07.103474 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:07.103457 2572 scope.go:117] "RemoveContainer" containerID="7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b" Apr 22 19:56:07.107707 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:56:07.107686 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5edabb_d258_4679_814d_9355f7ecdd50.slice/crio-6bb115313a794eba2afd6100f262c28a5c3b8f13550ea92577b0a498424c23f2\": RecentStats: unable to find data in memory cache]" Apr 22 19:56:07.107789 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:56:07.107763 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5edabb_d258_4679_814d_9355f7ecdd50.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5edabb_d258_4679_814d_9355f7ecdd50.slice/crio-6bb115313a794eba2afd6100f262c28a5c3b8f13550ea92577b0a498424c23f2\": RecentStats: unable to find data in memory cache]" Apr 22 19:56:07.117467 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:07.117440 2572 scope.go:117] "RemoveContainer" containerID="f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6" Apr 22 19:56:07.117964 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:56:07.117936 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6\": container with ID starting with f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6 not found: ID does not exist" containerID="f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6" Apr 22 19:56:07.118063 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:07.117974 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6"} err="failed to get container status \"f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6\": rpc error: code = NotFound desc = could not find container \"f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6\": container with ID starting with f702d20b8a5b5a940a67571c7c7382f9ffe3b865e4404f9b1814d41ac8332db6 not found: ID does not exist" Apr 22 19:56:07.118063 ip-10-0-134-231 
kubenswrapper[2572]: I0422 19:56:07.117999 2572 scope.go:117] "RemoveContainer" containerID="7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b" Apr 22 19:56:07.118441 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:56:07.118420 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b\": container with ID starting with 7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b not found: ID does not exist" containerID="7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b" Apr 22 19:56:07.118519 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:07.118445 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b"} err="failed to get container status \"7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b\": rpc error: code = NotFound desc = could not find container \"7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b\": container with ID starting with 7c48f5d1d0d42c209ca4a34c867441d46e2ec55c65df2b63762df2ccdc96791b not found: ID does not exist" Apr 22 19:56:07.130150 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:07.130122 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm"] Apr 22 19:56:07.136825 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:07.136804 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wjntm"] Apr 22 19:56:08.098941 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:08.098907 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerID="631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01" exitCode=0 Apr 22 19:56:08.099392 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:08.098988 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" event={"ID":"fa5d47f5-6ba6-4cbb-b240-8c611023d306","Type":"ContainerDied","Data":"631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01"} Apr 22 19:56:08.169414 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:08.169381 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" path="/var/lib/kubelet/pods/ac5edabb-d258-4679-814d-9355f7ecdd50/volumes" Apr 22 19:56:09.104921 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:09.104887 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" event={"ID":"fa5d47f5-6ba6-4cbb-b240-8c611023d306","Type":"ContainerStarted","Data":"cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8"} Apr 22 19:56:09.105321 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:09.105206 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" Apr 22 19:56:09.106249 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:09.106224 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 
19:56:09.120677 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:09.120626 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" podStartSLOduration=7.120614487 podStartE2EDuration="7.120614487s" podCreationTimestamp="2026-04-22 19:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:56:09.119291878 +0000 UTC m=+1983.528706413" watchObservedRunningTime="2026-04-22 19:56:09.120614487 +0000 UTC m=+1983.530029000" Apr 22 19:56:10.108778 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:10.108736 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:56:20.108960 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:20.108914 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:56:30.109241 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:30.109201 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:56:40.109021 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:40.108977 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:56:50.109052 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:56:50.109007 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:57:00.109566 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:00.109522 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:57:10.109340 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:10.109294 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:57:20.109118 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:20.109070 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:57:30.110700 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:30.110603 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" Apr 22 19:57:34.305102 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.305066 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8"] Apr 22 19:57:34.305462 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.305301 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" containerID="cri-o://cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8" gracePeriod=30 Apr 22 19:57:34.366989 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.366962 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl"] Apr 22 19:57:34.367271 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.367259 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="storage-initializer" Apr 22 19:57:34.367315 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.367273 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="storage-initializer" Apr 22 19:57:34.367315 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.367288 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" Apr 22 19:57:34.367315 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.367294 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" Apr 22 19:57:34.367403 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.367350 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac5edabb-d258-4679-814d-9355f7ecdd50" containerName="kserve-container" Apr 22 19:57:34.370339 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.370326 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" Apr 22 19:57:34.377522 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.377500 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl"] Apr 22 19:57:34.415509 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.415482 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47ac460b-2bf5-467d-ad27-b918a47ae1ab-kserve-provision-location\") pod \"isvc-primary-a85f27-predictor-85cb7c449c-rhzwl\" (UID: \"47ac460b-2bf5-467d-ad27-b918a47ae1ab\") " pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" Apr 22 19:57:34.515943 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.515917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47ac460b-2bf5-467d-ad27-b918a47ae1ab-kserve-provision-location\") pod \"isvc-primary-a85f27-predictor-85cb7c449c-rhzwl\" (UID: \"47ac460b-2bf5-467d-ad27-b918a47ae1ab\") " pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" Apr 22 19:57:34.516268 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.516250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47ac460b-2bf5-467d-ad27-b918a47ae1ab-kserve-provision-location\") pod \"isvc-primary-a85f27-predictor-85cb7c449c-rhzwl\" (UID: \"47ac460b-2bf5-467d-ad27-b918a47ae1ab\") " pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" Apr 22 19:57:34.680068 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.680039 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" Apr 22 19:57:34.794815 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:34.794757 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl"] Apr 22 19:57:34.797220 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:57:34.797190 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47ac460b_2bf5_467d_ad27_b918a47ae1ab.slice/crio-4e339f1fd1ae2250c41033d899f2af0adbefaf1464b05280d0d56fdfd58863de WatchSource:0}: Error finding container 4e339f1fd1ae2250c41033d899f2af0adbefaf1464b05280d0d56fdfd58863de: Status 404 returned error can't find the container with id 4e339f1fd1ae2250c41033d899f2af0adbefaf1464b05280d0d56fdfd58863de Apr 22 19:57:35.345292 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:35.345256 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" event={"ID":"47ac460b-2bf5-467d-ad27-b918a47ae1ab","Type":"ContainerStarted","Data":"65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d"} Apr 22 19:57:35.345292 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:35.345292 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" event={"ID":"47ac460b-2bf5-467d-ad27-b918a47ae1ab","Type":"ContainerStarted","Data":"4e339f1fd1ae2250c41033d899f2af0adbefaf1464b05280d0d56fdfd58863de"} Apr 22 19:57:37.351278 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.351252 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" Apr 22 19:57:37.351869 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.351847 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerID="cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8" exitCode=0 Apr 22 19:57:37.351954 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.351903 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" event={"ID":"fa5d47f5-6ba6-4cbb-b240-8c611023d306","Type":"ContainerDied","Data":"cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8"} Apr 22 19:57:37.351954 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.351928 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" event={"ID":"fa5d47f5-6ba6-4cbb-b240-8c611023d306","Type":"ContainerDied","Data":"09c389df9ac526fdb5c4ff6b2141fa92161719672fdeec702504593582abd503"} Apr 22 19:57:37.351954 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.351942 2572 scope.go:117] "RemoveContainer" containerID="cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8" Apr 22 19:57:37.360196 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.360178 2572 scope.go:117] "RemoveContainer" containerID="631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01" Apr 22 19:57:37.366745 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.366726 2572 scope.go:117] "RemoveContainer" containerID="cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8" Apr 22 19:57:37.367720 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:57:37.367693 2572 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8\": container with ID starting with cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8 not found: ID does not exist" containerID="cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8" Apr 22 19:57:37.367861 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.367749 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8"} err="failed to get container status \"cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8\": rpc error: code = NotFound desc = could not find container \"cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8\": container with ID starting with cf65160194c1de1ff2e9903c25e6ae80c4f3ed8d5d713b00eae1c3da48f329d8 not found: ID does not exist" Apr 22 19:57:37.367861 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.367777 2572 scope.go:117] "RemoveContainer" containerID="631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01" Apr 22 19:57:37.368859 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:57:37.368832 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01\": container with ID starting with 631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01 not found: ID does not exist" containerID="631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01" Apr 22 19:57:37.368956 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.368870 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01"} err="failed to get container status \"631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01\": rpc error: code = NotFound desc = could not find container \"631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01\": container with ID starting with 631dba134cca0b2a6d3d192df9fe43ed4aef3c8f238b139d58222024aa396a01 not found: ID does not exist" Apr 22 19:57:37.434051 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.434017 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa5d47f5-6ba6-4cbb-b240-8c611023d306-kserve-provision-location\") pod \"fa5d47f5-6ba6-4cbb-b240-8c611023d306\" (UID: \"fa5d47f5-6ba6-4cbb-b240-8c611023d306\") " Apr 22 19:57:37.434355 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.434331 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa5d47f5-6ba6-4cbb-b240-8c611023d306-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fa5d47f5-6ba6-4cbb-b240-8c611023d306" (UID: "fa5d47f5-6ba6-4cbb-b240-8c611023d306"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:57:37.535278 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:37.535243 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa5d47f5-6ba6-4cbb-b240-8c611023d306-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:57:38.356100 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:38.356076 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8" Apr 22 19:57:38.372455 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:38.372427 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8"] Apr 22 19:57:38.375202 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:38.375179 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-c58v8"] Apr 22 19:57:39.360761 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:39.360727 2572 generic.go:358] "Generic (PLEG): container finished" podID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerID="65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d" exitCode=0 Apr 22 19:57:39.361145 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:39.360799 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" event={"ID":"47ac460b-2bf5-467d-ad27-b918a47ae1ab","Type":"ContainerDied","Data":"65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d"} Apr 22 19:57:40.169945 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:40.169915 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" path="/var/lib/kubelet/pods/fa5d47f5-6ba6-4cbb-b240-8c611023d306/volumes" Apr 22 19:57:40.367349 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:40.367323 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" event={"ID":"47ac460b-2bf5-467d-ad27-b918a47ae1ab","Type":"ContainerStarted","Data":"8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18"} Apr 22 19:57:40.367712 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:40.367578 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" Apr 22 19:57:40.368861 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:40.368837 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:57:40.383315 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:40.383273 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" podStartSLOduration=6.383260941 podStartE2EDuration="6.383260941s" podCreationTimestamp="2026-04-22 19:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:57:40.381881119 +0000 UTC m=+2074.791295632" watchObservedRunningTime="2026-04-22 19:57:40.383260941 +0000 UTC m=+2074.792675516" Apr 22 
19:57:41.370247 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:41.370210 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:57:51.370903 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:57:51.370865 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:58:01.370897 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:01.370855 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:58:06.248148 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:06.248123 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:58:06.261118 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:06.261098 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 19:58:11.370975 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:11.370934 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:58:21.370564 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:21.370521 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:58:31.370386 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:31.370348 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:58:41.371831 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:41.371800 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" Apr 22 19:58:44.498427 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.498397 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv"] Apr 22 19:58:44.498818 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.498701 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" Apr 22 19:58:44.498818 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.498714 2572 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" Apr 22 19:58:44.498818 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.498749 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="storage-initializer" Apr 22 19:58:44.498818 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.498759 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="storage-initializer" Apr 22 19:58:44.498818 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.498808 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa5d47f5-6ba6-4cbb-b240-8c611023d306" containerName="kserve-container" Apr 22 19:58:44.501789 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.501771 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" Apr 22 19:58:44.504195 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.504176 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-a85f27\"" Apr 22 19:58:44.504353 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.504337 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-a85f27-dockercfg-wfcpt\"" Apr 22 19:58:44.504398 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.504391 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 19:58:44.510941 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.510918 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv"] Apr 22 19:58:44.590847 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.590814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef386ed5-208e-43a6-a040-163d28455851-kserve-provision-location\") pod \"isvc-secondary-a85f27-predictor-6d95d76444-gzgxv\" (UID: \"ef386ed5-208e-43a6-a040-163d28455851\") " pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" Apr 22 19:58:44.590991 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.590865 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ef386ed5-208e-43a6-a040-163d28455851-cabundle-cert\") pod \"isvc-secondary-a85f27-predictor-6d95d76444-gzgxv\" (UID: \"ef386ed5-208e-43a6-a040-163d28455851\") " pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" Apr 22 19:58:44.691927 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.691898 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef386ed5-208e-43a6-a040-163d28455851-kserve-provision-location\") pod \"isvc-secondary-a85f27-predictor-6d95d76444-gzgxv\" (UID: \"ef386ed5-208e-43a6-a040-163d28455851\") " pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" Apr 22 19:58:44.692040 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.691935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/ef386ed5-208e-43a6-a040-163d28455851-cabundle-cert\") pod \"isvc-secondary-a85f27-predictor-6d95d76444-gzgxv\" (UID: \"ef386ed5-208e-43a6-a040-163d28455851\") " pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" Apr 22 19:58:44.692281 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.692265 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef386ed5-208e-43a6-a040-163d28455851-kserve-provision-location\") pod \"isvc-secondary-a85f27-predictor-6d95d76444-gzgxv\" (UID: \"ef386ed5-208e-43a6-a040-163d28455851\") " pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" Apr 22 19:58:44.692439 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.692424 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ef386ed5-208e-43a6-a040-163d28455851-cabundle-cert\") pod \"isvc-secondary-a85f27-predictor-6d95d76444-gzgxv\" (UID: \"ef386ed5-208e-43a6-a040-163d28455851\") " pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" Apr 22 19:58:44.812224 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.812162 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" Apr 22 19:58:44.924831 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.924796 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv"] Apr 22 19:58:44.927947 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:58:44.927924 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef386ed5_208e_43a6_a040_163d28455851.slice/crio-c88fcd21a460f3bdb1b7246e5923b505c4c130907c7ed3ad658ee0004cdc54ec WatchSource:0}: Error finding container c88fcd21a460f3bdb1b7246e5923b505c4c130907c7ed3ad658ee0004cdc54ec: Status 404 returned error can't find the container with id c88fcd21a460f3bdb1b7246e5923b505c4c130907c7ed3ad658ee0004cdc54ec Apr 22 19:58:44.930128 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:44.930113 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:58:45.542443 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:45.542411 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" event={"ID":"ef386ed5-208e-43a6-a040-163d28455851","Type":"ContainerStarted","Data":"5c84bb21844bf10e30d24e50143e8a4c36d668dd0cdc0fe916da54a483452cc7"} Apr 22 19:58:45.542443 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:45.542446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" event={"ID":"ef386ed5-208e-43a6-a040-163d28455851","Type":"ContainerStarted","Data":"c88fcd21a460f3bdb1b7246e5923b505c4c130907c7ed3ad658ee0004cdc54ec"} Apr 22 19:58:48.551992 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:48.551963 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a85f27-predictor-6d95d76444-gzgxv_ef386ed5-208e-43a6-a040-163d28455851/storage-initializer/0.log" Apr 22 19:58:48.552347 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:48.552004 2572 generic.go:358] "Generic (PLEG): container finished" podID="ef386ed5-208e-43a6-a040-163d28455851" 
containerID="5c84bb21844bf10e30d24e50143e8a4c36d668dd0cdc0fe916da54a483452cc7" exitCode=1 Apr 22 19:58:48.552347 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:48.552032 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" event={"ID":"ef386ed5-208e-43a6-a040-163d28455851","Type":"ContainerDied","Data":"5c84bb21844bf10e30d24e50143e8a4c36d668dd0cdc0fe916da54a483452cc7"} Apr 22 19:58:49.556352 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:49.556324 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a85f27-predictor-6d95d76444-gzgxv_ef386ed5-208e-43a6-a040-163d28455851/storage-initializer/0.log" Apr 22 19:58:49.556742 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:49.556411 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" event={"ID":"ef386ed5-208e-43a6-a040-163d28455851","Type":"ContainerStarted","Data":"48fbe651c7d1fd65414ecf93e63ffe65e462776a1001a27cb418a231488f398a"} Apr 22 19:58:54.571028 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:54.570998 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a85f27-predictor-6d95d76444-gzgxv_ef386ed5-208e-43a6-a040-163d28455851/storage-initializer/1.log" Apr 22 19:58:54.571386 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:54.571326 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a85f27-predictor-6d95d76444-gzgxv_ef386ed5-208e-43a6-a040-163d28455851/storage-initializer/0.log" Apr 22 19:58:54.571386 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:54.571357 2572 generic.go:358] "Generic (PLEG): container finished" podID="ef386ed5-208e-43a6-a040-163d28455851" containerID="48fbe651c7d1fd65414ecf93e63ffe65e462776a1001a27cb418a231488f398a" exitCode=1 Apr 22 19:58:54.571466 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:54.571443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" event={"ID":"ef386ed5-208e-43a6-a040-163d28455851","Type":"ContainerDied","Data":"48fbe651c7d1fd65414ecf93e63ffe65e462776a1001a27cb418a231488f398a"} Apr 22 19:58:54.571505 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:54.571492 2572 scope.go:117] "RemoveContainer" containerID="5c84bb21844bf10e30d24e50143e8a4c36d668dd0cdc0fe916da54a483452cc7" Apr 22 19:58:54.571825 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:54.571809 2572 scope.go:117] "RemoveContainer" containerID="5c84bb21844bf10e30d24e50143e8a4c36d668dd0cdc0fe916da54a483452cc7" Apr 22 19:58:54.581216 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:58:54.581190 2572 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-a85f27-predictor-6d95d76444-gzgxv_kserve-ci-e2e-test_ef386ed5-208e-43a6-a040-163d28455851_0 in pod sandbox c88fcd21a460f3bdb1b7246e5923b505c4c130907c7ed3ad658ee0004cdc54ec from index: no such id: '5c84bb21844bf10e30d24e50143e8a4c36d668dd0cdc0fe916da54a483452cc7'" containerID="5c84bb21844bf10e30d24e50143e8a4c36d668dd0cdc0fe916da54a483452cc7" Apr 22 19:58:54.581280 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:58:54.581234 2572 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container 
k8s_storage-initializer_isvc-secondary-a85f27-predictor-6d95d76444-gzgxv_kserve-ci-e2e-test_ef386ed5-208e-43a6-a040-163d28455851_0 in pod sandbox c88fcd21a460f3bdb1b7246e5923b505c4c130907c7ed3ad658ee0004cdc54ec from index: no such id: '5c84bb21844bf10e30d24e50143e8a4c36d668dd0cdc0fe916da54a483452cc7'; Skipping pod \"isvc-secondary-a85f27-predictor-6d95d76444-gzgxv_kserve-ci-e2e-test(ef386ed5-208e-43a6-a040-163d28455851)\"" logger="UnhandledError" Apr 22 19:58:54.582524 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:58:54.582506 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-a85f27-predictor-6d95d76444-gzgxv_kserve-ci-e2e-test(ef386ed5-208e-43a6-a040-163d28455851)\"" pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" podUID="ef386ed5-208e-43a6-a040-163d28455851" Apr 22 19:58:55.575224 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:58:55.575197 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a85f27-predictor-6d95d76444-gzgxv_ef386ed5-208e-43a6-a040-163d28455851/storage-initializer/1.log" Apr 22 19:59:00.587440 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.587399 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv"] Apr 22 19:59:00.625835 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.625796 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl"] Apr 22 19:59:00.626472 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.626260 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" containerID="cri-o://8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18" gracePeriod=30 Apr 22 19:59:00.691249 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.691225 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s"] Apr 22 19:59:00.695569 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.695554 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" Apr 22 19:59:00.697953 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.697933 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-d0c23b\"" Apr 22 19:59:00.698112 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.698093 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-d0c23b-dockercfg-j5ptt\"" Apr 22 19:59:00.704872 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.704851 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s"] Apr 22 19:59:00.721607 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.721590 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a85f27-predictor-6d95d76444-gzgxv_ef386ed5-208e-43a6-a040-163d28455851/storage-initializer/1.log" Apr 22 19:59:00.721722 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.721654 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" Apr 22 19:59:00.801725 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.801697 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef386ed5-208e-43a6-a040-163d28455851-kserve-provision-location\") pod \"ef386ed5-208e-43a6-a040-163d28455851\" (UID: \"ef386ed5-208e-43a6-a040-163d28455851\") " Apr 22 19:59:00.801839 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.801787 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ef386ed5-208e-43a6-a040-163d28455851-cabundle-cert\") pod \"ef386ed5-208e-43a6-a040-163d28455851\" (UID: \"ef386ed5-208e-43a6-a040-163d28455851\") " Apr 22 19:59:00.801940 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.801926 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef386ed5-208e-43a6-a040-163d28455851-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ef386ed5-208e-43a6-a040-163d28455851" (UID: "ef386ed5-208e-43a6-a040-163d28455851"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:59:00.801999 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.801960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a0ad5f51-fd56-441b-9334-f15fa44d96a4-cabundle-cert\") pod \"isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s\" (UID: \"a0ad5f51-fd56-441b-9334-f15fa44d96a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" Apr 22 19:59:00.802054 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.802006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0ad5f51-fd56-441b-9334-f15fa44d96a4-kserve-provision-location\") pod \"isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s\" (UID: \"a0ad5f51-fd56-441b-9334-f15fa44d96a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" Apr 22 19:59:00.802110 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.802084 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef386ed5-208e-43a6-a040-163d28455851-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "ef386ed5-208e-43a6-a040-163d28455851" (UID: "ef386ed5-208e-43a6-a040-163d28455851"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:59:00.802162 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.802144 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef386ed5-208e-43a6-a040-163d28455851-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:59:00.902597 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.902576 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a0ad5f51-fd56-441b-9334-f15fa44d96a4-cabundle-cert\") pod \"isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s\" (UID: \"a0ad5f51-fd56-441b-9334-f15fa44d96a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" Apr 22 19:59:00.902735 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.902608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0ad5f51-fd56-441b-9334-f15fa44d96a4-kserve-provision-location\") pod \"isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s\" (UID: \"a0ad5f51-fd56-441b-9334-f15fa44d96a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" Apr 22 19:59:00.902735 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.902638 2572 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ef386ed5-208e-43a6-a040-163d28455851-cabundle-cert\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:59:00.902928 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.902913 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0ad5f51-fd56-441b-9334-f15fa44d96a4-kserve-provision-location\") pod \"isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s\" (UID: \"a0ad5f51-fd56-441b-9334-f15fa44d96a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" Apr 22 19:59:00.903246 
ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:00.903225 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a0ad5f51-fd56-441b-9334-f15fa44d96a4-cabundle-cert\") pod \"isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s\" (UID: \"a0ad5f51-fd56-441b-9334-f15fa44d96a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" Apr 22 19:59:01.006569 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:01.006547 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" Apr 22 19:59:01.123588 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:01.123564 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s"] Apr 22 19:59:01.126065 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:59:01.126038 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0ad5f51_fd56_441b_9334_f15fa44d96a4.slice/crio-319e7138e37e252846d20ffb7cc69decd1e5ab9ad2488c2b87668763d9e7f5e1 WatchSource:0}: Error finding container 319e7138e37e252846d20ffb7cc69decd1e5ab9ad2488c2b87668763d9e7f5e1: Status 404 returned error can't find the container with id 319e7138e37e252846d20ffb7cc69decd1e5ab9ad2488c2b87668763d9e7f5e1 Apr 22 19:59:01.370480 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:01.370444 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:59:01.597931 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:01.597850 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a85f27-predictor-6d95d76444-gzgxv_ef386ed5-208e-43a6-a040-163d28455851/storage-initializer/1.log" Apr 22 19:59:01.598438 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:01.597965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" event={"ID":"ef386ed5-208e-43a6-a040-163d28455851","Type":"ContainerDied","Data":"c88fcd21a460f3bdb1b7246e5923b505c4c130907c7ed3ad658ee0004cdc54ec"} Apr 22 19:59:01.598438 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:01.597979 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv" Apr 22 19:59:01.598438 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:01.598003 2572 scope.go:117] "RemoveContainer" containerID="48fbe651c7d1fd65414ecf93e63ffe65e462776a1001a27cb418a231488f398a" Apr 22 19:59:01.599461 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:01.599436 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" event={"ID":"a0ad5f51-fd56-441b-9334-f15fa44d96a4","Type":"ContainerStarted","Data":"be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e"} Apr 22 19:59:01.599564 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:01.599472 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" event={"ID":"a0ad5f51-fd56-441b-9334-f15fa44d96a4","Type":"ContainerStarted","Data":"319e7138e37e252846d20ffb7cc69decd1e5ab9ad2488c2b87668763d9e7f5e1"} Apr 22 19:59:01.639019 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:01.638997 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv"] Apr 22 19:59:01.643343 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:01.643320 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a85f27-predictor-6d95d76444-gzgxv"] Apr 22 19:59:02.170176 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:02.170143 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef386ed5-208e-43a6-a040-163d28455851" path="/var/lib/kubelet/pods/ef386ed5-208e-43a6-a040-163d28455851/volumes" Apr 22 19:59:04.362414 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.362394 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" Apr 22 19:59:04.528148 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.528123 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47ac460b-2bf5-467d-ad27-b918a47ae1ab-kserve-provision-location\") pod \"47ac460b-2bf5-467d-ad27-b918a47ae1ab\" (UID: \"47ac460b-2bf5-467d-ad27-b918a47ae1ab\") " Apr 22 19:59:04.528421 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.528397 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ac460b-2bf5-467d-ad27-b918a47ae1ab-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "47ac460b-2bf5-467d-ad27-b918a47ae1ab" (UID: "47ac460b-2bf5-467d-ad27-b918a47ae1ab"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:59:04.610415 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.610382 2572 generic.go:358] "Generic (PLEG): container finished" podID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerID="8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18" exitCode=0 Apr 22 19:59:04.610543 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.610471 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" Apr 22 19:59:04.610620 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.610467 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" event={"ID":"47ac460b-2bf5-467d-ad27-b918a47ae1ab","Type":"ContainerDied","Data":"8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18"} Apr 22 19:59:04.610751 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.610632 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl" event={"ID":"47ac460b-2bf5-467d-ad27-b918a47ae1ab","Type":"ContainerDied","Data":"4e339f1fd1ae2250c41033d899f2af0adbefaf1464b05280d0d56fdfd58863de"} Apr 22 19:59:04.610751 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.610675 2572 scope.go:117] "RemoveContainer" containerID="8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18" Apr 22 19:59:04.611797 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.611780 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s_a0ad5f51-fd56-441b-9334-f15fa44d96a4/storage-initializer/0.log" Apr 22 19:59:04.611878 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.611820 2572 generic.go:358] "Generic (PLEG): container finished" podID="a0ad5f51-fd56-441b-9334-f15fa44d96a4" containerID="be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e" exitCode=1 Apr 22 19:59:04.611878 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.611874 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" event={"ID":"a0ad5f51-fd56-441b-9334-f15fa44d96a4","Type":"ContainerDied","Data":"be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e"} Apr 22 19:59:04.617886 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.617866 2572 scope.go:117] "RemoveContainer" containerID="65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d" Apr 22 19:59:04.625044 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.625026 2572 scope.go:117] "RemoveContainer" containerID="8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18" Apr 22 19:59:04.625298 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:59:04.625281 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18\": container with ID starting with 8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18 not found: ID does not exist" containerID="8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18" Apr 22 19:59:04.625367 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.625305 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18"} err="failed to get container status \"8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18\": rpc error: code = NotFound desc = could not find container \"8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18\": container with ID starting with 8f2f8f948a0ca4716e98b583ae763071718c1c92363efa1807b8ad4530f4ea18 not found: ID does not exist" Apr 22 19:59:04.625367 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.625323 2572 scope.go:117] "RemoveContainer" 
containerID="65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d" Apr 22 19:59:04.625574 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:59:04.625559 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d\": container with ID starting with 65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d not found: ID does not exist" containerID="65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d" Apr 22 19:59:04.625633 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.625577 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d"} err="failed to get container status \"65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d\": rpc error: code = NotFound desc = could not find container \"65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d\": container with ID starting with 65a29f50edf8076a5074f0e510060edda50468bf1f46e911318123e59666f32d not found: ID does not exist" Apr 22 19:59:04.628772 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.628740 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47ac460b-2bf5-467d-ad27-b918a47ae1ab-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:59:04.639688 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.639653 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl"] Apr 22 19:59:04.643469 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:04.643449 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a85f27-predictor-85cb7c449c-rhzwl"] Apr 22 19:59:05.616697 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.616656 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s_a0ad5f51-fd56-441b-9334-f15fa44d96a4/storage-initializer/0.log" Apr 22 19:59:05.617104 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.616730 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" event={"ID":"a0ad5f51-fd56-441b-9334-f15fa44d96a4","Type":"ContainerStarted","Data":"a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986"} Apr 22 19:59:05.705460 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.705433 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s"] Apr 22 19:59:05.810515 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810484 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk"] Apr 22 19:59:05.810795 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810783 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef386ed5-208e-43a6-a040-163d28455851" containerName="storage-initializer" Apr 22 19:59:05.810847 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810797 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef386ed5-208e-43a6-a040-163d28455851" containerName="storage-initializer" Apr 22 19:59:05.810847 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810824 2572 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="storage-initializer" Apr 22 19:59:05.810847 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810830 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="storage-initializer" Apr 22 19:59:05.810847 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810838 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" Apr 22 19:59:05.810847 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810843 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" Apr 22 19:59:05.810994 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810882 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" containerName="kserve-container" Apr 22 19:59:05.810994 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810894 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef386ed5-208e-43a6-a040-163d28455851" containerName="storage-initializer" Apr 22 19:59:05.810994 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810901 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef386ed5-208e-43a6-a040-163d28455851" containerName="storage-initializer" Apr 22 19:59:05.810994 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810947 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef386ed5-208e-43a6-a040-163d28455851" containerName="storage-initializer" Apr 22 19:59:05.810994 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.810953 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef386ed5-208e-43a6-a040-163d28455851" containerName="storage-initializer" Apr 22 19:59:05.813819 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.813800 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" Apr 22 19:59:05.816212 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.816193 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5f4v\"" Apr 22 19:59:05.823165 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.823142 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk"] Apr 22 19:59:05.938124 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:05.938097 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdc9043e-44bc-41c8-a83b-0209df6c028b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-cvkxk\" (UID: \"cdc9043e-44bc-41c8-a83b-0209df6c028b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" Apr 22 19:59:06.039195 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:06.039165 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdc9043e-44bc-41c8-a83b-0209df6c028b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-cvkxk\" (UID: \"cdc9043e-44bc-41c8-a83b-0209df6c028b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" Apr 22 19:59:06.039510 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:06.039491 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdc9043e-44bc-41c8-a83b-0209df6c028b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-cvkxk\" (UID: \"cdc9043e-44bc-41c8-a83b-0209df6c028b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" Apr 22 19:59:06.126749 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:06.126726 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5f4v\"" Apr 22 19:59:06.134806 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:06.134789 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" Apr 22 19:59:06.170521 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:06.170496 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47ac460b-2bf5-467d-ad27-b918a47ae1ab" path="/var/lib/kubelet/pods/47ac460b-2bf5-467d-ad27-b918a47ae1ab/volumes" Apr 22 19:59:06.251833 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:06.251800 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk"] Apr 22 19:59:06.253792 ip-10-0-134-231 kubenswrapper[2572]: W0422 19:59:06.253768 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdc9043e_44bc_41c8_a83b_0209df6c028b.slice/crio-aa71d41f212cf0f159277c3fc89c956ec784c77c22f0a0f5bebb5501a4329e8f WatchSource:0}: Error finding container aa71d41f212cf0f159277c3fc89c956ec784c77c22f0a0f5bebb5501a4329e8f: Status 404 returned error can't find the container with id aa71d41f212cf0f159277c3fc89c956ec784c77c22f0a0f5bebb5501a4329e8f Apr 22 19:59:06.621175 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:06.621096 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" event={"ID":"cdc9043e-44bc-41c8-a83b-0209df6c028b","Type":"ContainerStarted","Data":"0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff"} Apr 22 19:59:06.621175 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:06.621137 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" event={"ID":"cdc9043e-44bc-41c8-a83b-0209df6c028b","Type":"ContainerStarted","Data":"aa71d41f212cf0f159277c3fc89c956ec784c77c22f0a0f5bebb5501a4329e8f"} Apr 22 19:59:06.621632 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:06.621345 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" podUID="a0ad5f51-fd56-441b-9334-f15fa44d96a4" containerName="storage-initializer" containerID="cri-o://a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986" gracePeriod=30 Apr 22 19:59:09.960656 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:09.960633 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s_a0ad5f51-fd56-441b-9334-f15fa44d96a4/storage-initializer/1.log" Apr 22 19:59:09.961019 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:09.961004 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s_a0ad5f51-fd56-441b-9334-f15fa44d96a4/storage-initializer/0.log" Apr 22 19:59:09.961074 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:09.961064 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" Apr 22 19:59:10.067557 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.067496 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0ad5f51-fd56-441b-9334-f15fa44d96a4-kserve-provision-location\") pod \"a0ad5f51-fd56-441b-9334-f15fa44d96a4\" (UID: \"a0ad5f51-fd56-441b-9334-f15fa44d96a4\") " Apr 22 19:59:10.067702 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.067603 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a0ad5f51-fd56-441b-9334-f15fa44d96a4-cabundle-cert\") pod \"a0ad5f51-fd56-441b-9334-f15fa44d96a4\" (UID: \"a0ad5f51-fd56-441b-9334-f15fa44d96a4\") " Apr 22 19:59:10.068313 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.068278 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0ad5f51-fd56-441b-9334-f15fa44d96a4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a0ad5f51-fd56-441b-9334-f15fa44d96a4" (UID: "a0ad5f51-fd56-441b-9334-f15fa44d96a4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:59:10.068411 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.068391 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ad5f51-fd56-441b-9334-f15fa44d96a4-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a0ad5f51-fd56-441b-9334-f15fa44d96a4" (UID: "a0ad5f51-fd56-441b-9334-f15fa44d96a4"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:59:10.168674 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.168638 2572 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a0ad5f51-fd56-441b-9334-f15fa44d96a4-cabundle-cert\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:59:10.168796 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.168678 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0ad5f51-fd56-441b-9334-f15fa44d96a4-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 19:59:10.633854 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.633832 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s_a0ad5f51-fd56-441b-9334-f15fa44d96a4/storage-initializer/1.log" Apr 22 19:59:10.634199 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.634181 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s_a0ad5f51-fd56-441b-9334-f15fa44d96a4/storage-initializer/0.log" Apr 22 19:59:10.634285 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.634224 2572 generic.go:358] "Generic (PLEG): container finished" podID="a0ad5f51-fd56-441b-9334-f15fa44d96a4" containerID="a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986" exitCode=1 Apr 22 19:59:10.634351 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.634294 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" Apr 22 19:59:10.634351 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.634313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" event={"ID":"a0ad5f51-fd56-441b-9334-f15fa44d96a4","Type":"ContainerDied","Data":"a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986"} Apr 22 19:59:10.634427 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.634360 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s" event={"ID":"a0ad5f51-fd56-441b-9334-f15fa44d96a4","Type":"ContainerDied","Data":"319e7138e37e252846d20ffb7cc69decd1e5ab9ad2488c2b87668763d9e7f5e1"} Apr 22 19:59:10.634427 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.634385 2572 scope.go:117] "RemoveContainer" containerID="a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986" Apr 22 19:59:10.635916 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.635887 2572 generic.go:358] "Generic (PLEG): container finished" podID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerID="0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff" exitCode=0 Apr 22 19:59:10.635989 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.635943 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" event={"ID":"cdc9043e-44bc-41c8-a83b-0209df6c028b","Type":"ContainerDied","Data":"0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff"} Apr 22 19:59:10.645258 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.645239 2572 scope.go:117] "RemoveContainer" containerID="be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e" Apr 22 19:59:10.652611 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.652595 2572 scope.go:117] "RemoveContainer" containerID="a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986" Apr 22 19:59:10.652891 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:59:10.652872 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986\": container with ID starting with a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986 not found: ID does not exist" containerID="a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986" Apr 22 19:59:10.652968 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.652899 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986"} err="failed to get container status \"a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986\": rpc error: code = NotFound desc = could not find container \"a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986\": container with ID starting with a46805fd349946a7cd0bad8b0080a4540b018ce9b9fe3e86d851244f05950986 not found: ID does not exist" Apr 22 19:59:10.652968 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.652917 2572 scope.go:117] "RemoveContainer" containerID="be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e" Apr 22 19:59:10.653170 ip-10-0-134-231 kubenswrapper[2572]: E0422 19:59:10.653148 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e\": container with ID starting with be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e not found: ID does not exist" containerID="be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e" Apr 22 19:59:10.653229 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.653180 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e"} err="failed to get container status \"be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e\": rpc error: code = NotFound desc = could not find container \"be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e\": container with ID starting with be01e8885a94d7fe13c8bc3b7f0c1da591fda938fa1384ae189946742d79091e not found: ID does not exist" Apr 22 19:59:10.666058 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.666037 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s"] Apr 22 19:59:10.671784 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:10.671762 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d0c23b-predictor-f8f8bdfb6-lf97s"] Apr 22 19:59:12.170076 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:12.170045 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0ad5f51-fd56-441b-9334-f15fa44d96a4" path="/var/lib/kubelet/pods/a0ad5f51-fd56-441b-9334-f15fa44d96a4/volumes" Apr 22 19:59:30.699328 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:30.699296 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" event={"ID":"cdc9043e-44bc-41c8-a83b-0209df6c028b","Type":"ContainerStarted","Data":"705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee"} Apr 22 19:59:30.699701 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:30.699658 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" Apr 22 19:59:30.700696 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:30.700658 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 22 19:59:30.717413 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:30.717362 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" podStartSLOduration=5.882930254 podStartE2EDuration="25.717349027s" podCreationTimestamp="2026-04-22 19:59:05 +0000 UTC" firstStartedPulling="2026-04-22 19:59:10.637233187 +0000 UTC m=+2165.046647678" lastFinishedPulling="2026-04-22 19:59:30.471651948 +0000 UTC m=+2184.881066451" observedRunningTime="2026-04-22 19:59:30.716290236 +0000 UTC m=+2185.125704750" watchObservedRunningTime="2026-04-22 19:59:30.717349027 +0000 UTC m=+2185.126763540" Apr 22 19:59:31.702685 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:31.702633 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 22 19:59:41.703516 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:41.703474 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 22 19:59:51.703498 ip-10-0-134-231 kubenswrapper[2572]: I0422 19:59:51.703460 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 22 20:00:01.702985 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:01.702945 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 22 20:00:11.703733 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:11.703692 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 22 20:00:21.702881 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:21.702837 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 22 20:00:31.703601 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:31.703519 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 22 20:00:41.703849 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:41.703812 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" Apr 22 20:00:46.003032 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.003003 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk"] Apr 22 20:00:46.003478 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.003220 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" containerID="cri-o://705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee" gracePeriod=30 Apr 22 20:00:46.072422 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.069928 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g"] Apr 22 20:00:46.072422 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.070620 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="a0ad5f51-fd56-441b-9334-f15fa44d96a4" containerName="storage-initializer" Apr 22 20:00:46.072422 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.070635 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ad5f51-fd56-441b-9334-f15fa44d96a4" containerName="storage-initializer" Apr 22 20:00:46.072422 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.070681 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0ad5f51-fd56-441b-9334-f15fa44d96a4" containerName="storage-initializer" Apr 22 20:00:46.072422 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.070691 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ad5f51-fd56-441b-9334-f15fa44d96a4" containerName="storage-initializer" Apr 22 20:00:46.072422 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.070805 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0ad5f51-fd56-441b-9334-f15fa44d96a4" containerName="storage-initializer" Apr 22 20:00:46.072422 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.071057 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0ad5f51-fd56-441b-9334-f15fa44d96a4" containerName="storage-initializer" Apr 22 20:00:46.074689 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.074648 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" Apr 22 20:00:46.079742 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.079716 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g"] Apr 22 20:00:46.160169 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.160142 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a951a6aa-be67-4baf-89ea-95c51bc2e536-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g\" (UID: \"a951a6aa-be67-4baf-89ea-95c51bc2e536\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" Apr 22 20:00:46.260890 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.260824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a951a6aa-be67-4baf-89ea-95c51bc2e536-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g\" (UID: \"a951a6aa-be67-4baf-89ea-95c51bc2e536\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" Apr 22 20:00:46.261178 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.261160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a951a6aa-be67-4baf-89ea-95c51bc2e536-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g\" (UID: \"a951a6aa-be67-4baf-89ea-95c51bc2e536\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" Apr 22 20:00:46.386612 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.386591 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" Apr 22 20:00:46.501299 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.501271 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g"] Apr 22 20:00:46.504202 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:00:46.504176 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda951a6aa_be67_4baf_89ea_95c51bc2e536.slice/crio-305c7857e418f164ef2f8baca4052717f81b2f9d35613cb3a0da1ec50b4d7294 WatchSource:0}: Error finding container 305c7857e418f164ef2f8baca4052717f81b2f9d35613cb3a0da1ec50b4d7294: Status 404 returned error can't find the container with id 305c7857e418f164ef2f8baca4052717f81b2f9d35613cb3a0da1ec50b4d7294 Apr 22 20:00:46.919623 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.919591 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" event={"ID":"a951a6aa-be67-4baf-89ea-95c51bc2e536","Type":"ContainerStarted","Data":"68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755"} Apr 22 20:00:46.919623 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:46.919626 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" event={"ID":"a951a6aa-be67-4baf-89ea-95c51bc2e536","Type":"ContainerStarted","Data":"305c7857e418f164ef2f8baca4052717f81b2f9d35613cb3a0da1ec50b4d7294"} Apr 22 20:00:50.137397 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.137370 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" Apr 22 20:00:50.291229 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.291163 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdc9043e-44bc-41c8-a83b-0209df6c028b-kserve-provision-location\") pod \"cdc9043e-44bc-41c8-a83b-0209df6c028b\" (UID: \"cdc9043e-44bc-41c8-a83b-0209df6c028b\") " Apr 22 20:00:50.291489 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.291465 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdc9043e-44bc-41c8-a83b-0209df6c028b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cdc9043e-44bc-41c8-a83b-0209df6c028b" (UID: "cdc9043e-44bc-41c8-a83b-0209df6c028b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:00:50.391861 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.391839 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdc9043e-44bc-41c8-a83b-0209df6c028b-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:00:50.933532 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.933497 2572 generic.go:358] "Generic (PLEG): container finished" podID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerID="705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee" exitCode=0 Apr 22 20:00:50.933738 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.933567 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" Apr 22 20:00:50.933738 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.933588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" event={"ID":"cdc9043e-44bc-41c8-a83b-0209df6c028b","Type":"ContainerDied","Data":"705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee"} Apr 22 20:00:50.933738 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.933622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk" event={"ID":"cdc9043e-44bc-41c8-a83b-0209df6c028b","Type":"ContainerDied","Data":"aa71d41f212cf0f159277c3fc89c956ec784c77c22f0a0f5bebb5501a4329e8f"} Apr 22 20:00:50.933738 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.933643 2572 scope.go:117] "RemoveContainer" containerID="705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee" Apr 22 20:00:50.935193 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.935169 2572 generic.go:358] "Generic (PLEG): container finished" podID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerID="68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755" exitCode=0 Apr 22 20:00:50.935305 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.935231 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" event={"ID":"a951a6aa-be67-4baf-89ea-95c51bc2e536","Type":"ContainerDied","Data":"68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755"} Apr 22 20:00:50.942537 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.942380 2572 scope.go:117] "RemoveContainer" containerID="0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff" Apr 22 20:00:50.949593 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.949575 2572 scope.go:117] "RemoveContainer" containerID="705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee" Apr 22 20:00:50.949861 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:00:50.949844 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee\": container with ID starting with 705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee not found: ID does not exist" containerID="705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee" Apr 22 20:00:50.949935 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.949868 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee"} err="failed to get container status \"705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee\": rpc error: code = NotFound desc = could not find container \"705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee\": container with ID starting with 705a1f4a50e76fa27b23658f31bda245f6ef561b8ea53f440287296c2d434cee not found: ID does not exist" Apr 22 20:00:50.949935 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.949884 2572 scope.go:117] "RemoveContainer" containerID="0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff" Apr 22 20:00:50.950117 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:00:50.950089 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff\": container with ID starting with 0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff not found: ID does not exist" containerID="0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff" Apr 22 20:00:50.950157 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.950128 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff"} err="failed to get container status \"0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff\": rpc error: code = NotFound desc = could not find container \"0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff\": container with ID starting with 0fa6af53d58ec07473135dae2e4c485e501573f03d92a542b863d631442e3dff not found: ID does not exist" Apr 22 20:00:50.963874 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.963852 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk"] Apr 22 20:00:50.965640 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:50.965618 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cvkxk"] Apr 22 20:00:51.940438 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:51.940409 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" event={"ID":"a951a6aa-be67-4baf-89ea-95c51bc2e536","Type":"ContainerStarted","Data":"4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620"} Apr 22 20:00:51.940861 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:51.940695 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" Apr 22 20:00:51.941793 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:51.941766 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 20:00:51.957156 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:51.957120 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" podStartSLOduration=5.957107255 podStartE2EDuration="5.957107255s" podCreationTimestamp="2026-04-22 20:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:51.955247533 +0000 UTC m=+2266.364662046" watchObservedRunningTime="2026-04-22 20:00:51.957107255 +0000 UTC m=+2266.366521770" Apr 22 20:00:52.169955 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:52.169934 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" path="/var/lib/kubelet/pods/cdc9043e-44bc-41c8-a83b-0209df6c028b/volumes" Apr 22 20:00:52.943740 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:00:52.943704 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: 
connect: connection refused" Apr 22 20:01:02.944485 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:01:02.944444 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 20:01:12.943883 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:01:12.943840 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 20:01:22.944616 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:01:22.944579 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 20:01:32.944718 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:01:32.944659 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 20:01:42.943986 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:01:42.943941 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 20:01:52.944693 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:01:52.944638 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 20:02:02.945754 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:02.945719 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" Apr 22 20:02:06.196895 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.196862 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g"] Apr 22 20:02:06.197283 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.197186 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" containerID="cri-o://4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620" gracePeriod=30 Apr 22 20:02:06.233109 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.233083 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7"] Apr 22 20:02:06.233353 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.233342 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="storage-initializer" Apr 22 20:02:06.233405 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.233355 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="storage-initializer" Apr 22 20:02:06.233405 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.233364 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" Apr 22 20:02:06.233405 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.233370 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" Apr 22 20:02:06.233499 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.233429 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdc9043e-44bc-41c8-a83b-0209df6c028b" containerName="kserve-container" Apr 22 20:02:06.236311 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.236296 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" Apr 22 20:02:06.242791 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.242656 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7"] Apr 22 20:02:06.299190 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.299167 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14d145cc-a86d-4cc6-9c94-78077fa200ad-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-h5kh7\" (UID: \"14d145cc-a86d-4cc6-9c94-78077fa200ad\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" Apr 22 20:02:06.400090 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.400060 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14d145cc-a86d-4cc6-9c94-78077fa200ad-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-h5kh7\" (UID: \"14d145cc-a86d-4cc6-9c94-78077fa200ad\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" Apr 22 20:02:06.400357 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.400339 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14d145cc-a86d-4cc6-9c94-78077fa200ad-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-h5kh7\" (UID: \"14d145cc-a86d-4cc6-9c94-78077fa200ad\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" Apr 22 20:02:06.546832 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.546777 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" Apr 22 20:02:06.660526 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:06.660489 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7"] Apr 22 20:02:06.664482 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:02:06.664456 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14d145cc_a86d_4cc6_9c94_78077fa200ad.slice/crio-c6efa448b60e0e45d738d1d3c0d133b9cb984a2b3e0e7ea831cf84890a5d0204 WatchSource:0}: Error finding container c6efa448b60e0e45d738d1d3c0d133b9cb984a2b3e0e7ea831cf84890a5d0204: Status 404 returned error can't find the container with id c6efa448b60e0e45d738d1d3c0d133b9cb984a2b3e0e7ea831cf84890a5d0204 Apr 22 20:02:07.154627 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:07.154593 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" event={"ID":"14d145cc-a86d-4cc6-9c94-78077fa200ad","Type":"ContainerStarted","Data":"0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e"} Apr 22 20:02:07.154627 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:07.154631 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" event={"ID":"14d145cc-a86d-4cc6-9c94-78077fa200ad","Type":"ContainerStarted","Data":"c6efa448b60e0e45d738d1d3c0d133b9cb984a2b3e0e7ea831cf84890a5d0204"} Apr 22 20:02:10.425462 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:10.425442 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" Apr 22 20:02:10.529796 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:10.529772 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a951a6aa-be67-4baf-89ea-95c51bc2e536-kserve-provision-location\") pod \"a951a6aa-be67-4baf-89ea-95c51bc2e536\" (UID: \"a951a6aa-be67-4baf-89ea-95c51bc2e536\") " Apr 22 20:02:10.530076 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:10.530057 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a951a6aa-be67-4baf-89ea-95c51bc2e536-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a951a6aa-be67-4baf-89ea-95c51bc2e536" (UID: "a951a6aa-be67-4baf-89ea-95c51bc2e536"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:02:10.630621 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:10.630596 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a951a6aa-be67-4baf-89ea-95c51bc2e536-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:02:11.168019 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.167988 2572 generic.go:358] "Generic (PLEG): container finished" podID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerID="4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620" exitCode=0 Apr 22 20:02:11.168158 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.168062 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" Apr 22 20:02:11.168158 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.168070 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" event={"ID":"a951a6aa-be67-4baf-89ea-95c51bc2e536","Type":"ContainerDied","Data":"4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620"} Apr 22 20:02:11.168158 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.168113 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g" event={"ID":"a951a6aa-be67-4baf-89ea-95c51bc2e536","Type":"ContainerDied","Data":"305c7857e418f164ef2f8baca4052717f81b2f9d35613cb3a0da1ec50b4d7294"} Apr 22 20:02:11.168158 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.168131 2572 scope.go:117] "RemoveContainer" containerID="4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620" Apr 22 20:02:11.169696 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.169649 2572 generic.go:358] "Generic (PLEG): container finished" podID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerID="0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e" exitCode=0 Apr 22 20:02:11.169781 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.169695 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" event={"ID":"14d145cc-a86d-4cc6-9c94-78077fa200ad","Type":"ContainerDied","Data":"0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e"} Apr 22 20:02:11.176840 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.176705 2572 scope.go:117] "RemoveContainer" containerID="68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755" Apr 22 20:02:11.183702 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.183685 2572 scope.go:117] "RemoveContainer" containerID="4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620" Apr 22 20:02:11.183980 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:02:11.183963 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620\": container with ID starting with 4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620 not found: ID does not exist" containerID="4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620" Apr 22 20:02:11.184029 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.183989 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620"} err="failed to get container status \"4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620\": rpc error: code = NotFound desc = could not find container \"4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620\": container with ID starting with 4fa80894084dbd78ddc9399414919914a2c52c1eb076b2ce2c306700d3eed620 not found: ID does not exist" Apr 22 20:02:11.184029 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.184007 2572 scope.go:117] "RemoveContainer" containerID="68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755" Apr 22 20:02:11.184256 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:02:11.184235 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755\": container with ID starting with 68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755 not found: ID does not exist" containerID="68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755" Apr 22 20:02:11.184307 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.184267 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755"} err="failed to get container status \"68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755\": rpc error: code = NotFound desc = could not find container \"68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755\": container with ID starting with 68b563b93580c75a6c580d8183d6ce2bc06bd58cac488aad038cb34f079f4755 not found: ID does not exist" Apr 22 20:02:11.201767 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.201745 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g"] Apr 22 20:02:11.204612 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:11.204591 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-h2c6g"] Apr 22 20:02:12.170015 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:12.169983 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" path="/var/lib/kubelet/pods/a951a6aa-be67-4baf-89ea-95c51bc2e536/volumes" Apr 22 20:02:12.174926 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:12.174899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" event={"ID":"14d145cc-a86d-4cc6-9c94-78077fa200ad","Type":"ContainerStarted","Data":"1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe"} Apr 22 20:02:12.175251 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:12.175236 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" Apr 22 20:02:12.176361 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:12.176335 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:02:12.191803 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:12.191760 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" podStartSLOduration=6.191749644 podStartE2EDuration="6.191749644s" podCreationTimestamp="2026-04-22 20:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:02:12.190611907 +0000 UTC m=+2346.600026431" watchObservedRunningTime="2026-04-22 20:02:12.191749644 +0000 UTC m=+2346.601164156" Apr 22 20:02:13.177781 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:13.177742 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: 
connect: connection refused" Apr 22 20:02:23.178023 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:23.177979 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:02:33.178135 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:33.178088 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:02:43.178146 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:43.178106 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:02:53.177934 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:02:53.177895 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:03:03.178613 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:03.178565 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:03:06.267049 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:06.267025 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 20:03:06.281234 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:06.281213 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 20:03:13.178233 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:13.178192 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:03:23.179098 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:23.179065 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" Apr 22 20:03:26.380359 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.380285 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7"] Apr 22 20:03:26.380719 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.380642 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" 
podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" containerID="cri-o://1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe" gracePeriod=30 Apr 22 20:03:26.426321 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.426296 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w"] Apr 22 20:03:26.426588 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.426577 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="storage-initializer" Apr 22 20:03:26.426635 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.426590 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="storage-initializer" Apr 22 20:03:26.426635 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.426607 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" Apr 22 20:03:26.426635 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.426612 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" Apr 22 20:03:26.426744 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.426675 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a951a6aa-be67-4baf-89ea-95c51bc2e536" containerName="kserve-container" Apr 22 20:03:26.429517 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.429501 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" Apr 22 20:03:26.441685 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.441649 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w"] Apr 22 20:03:26.441916 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.441896 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f6851de-9402-476a-880e-fb3d177629d2-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w\" (UID: \"0f6851de-9402-476a-880e-fb3d177629d2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" Apr 22 20:03:26.542252 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.542229 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f6851de-9402-476a-880e-fb3d177629d2-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w\" (UID: \"0f6851de-9402-476a-880e-fb3d177629d2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" Apr 22 20:03:26.542627 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.542606 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f6851de-9402-476a-880e-fb3d177629d2-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w\" (UID: \"0f6851de-9402-476a-880e-fb3d177629d2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" Apr 22 20:03:26.739010 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.738988 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" Apr 22 20:03:26.855188 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:26.855158 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w"] Apr 22 20:03:26.858071 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:03:26.858037 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f6851de_9402_476a_880e_fb3d177629d2.slice/crio-92339faaafff23f29487a4310ae16bc183aa226099cddc6eb3a0b6a2d09aa97a WatchSource:0}: Error finding container 92339faaafff23f29487a4310ae16bc183aa226099cddc6eb3a0b6a2d09aa97a: Status 404 returned error can't find the container with id 92339faaafff23f29487a4310ae16bc183aa226099cddc6eb3a0b6a2d09aa97a Apr 22 20:03:27.379027 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:27.378997 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" event={"ID":"0f6851de-9402-476a-880e-fb3d177629d2","Type":"ContainerStarted","Data":"5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5"} Apr 22 20:03:27.379027 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:27.379029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" event={"ID":"0f6851de-9402-476a-880e-fb3d177629d2","Type":"ContainerStarted","Data":"92339faaafff23f29487a4310ae16bc183aa226099cddc6eb3a0b6a2d09aa97a"} Apr 22 20:03:30.505622 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:30.505598 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" Apr 22 20:03:30.570147 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:30.570123 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14d145cc-a86d-4cc6-9c94-78077fa200ad-kserve-provision-location\") pod \"14d145cc-a86d-4cc6-9c94-78077fa200ad\" (UID: \"14d145cc-a86d-4cc6-9c94-78077fa200ad\") " Apr 22 20:03:30.570422 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:30.570402 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14d145cc-a86d-4cc6-9c94-78077fa200ad-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "14d145cc-a86d-4cc6-9c94-78077fa200ad" (UID: "14d145cc-a86d-4cc6-9c94-78077fa200ad"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:03:30.671079 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:30.671056 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14d145cc-a86d-4cc6-9c94-78077fa200ad-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:03:31.391912 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.391880 2572 generic.go:358] "Generic (PLEG): container finished" podID="0f6851de-9402-476a-880e-fb3d177629d2" containerID="5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5" exitCode=0 Apr 22 20:03:31.392089 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.391954 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" event={"ID":"0f6851de-9402-476a-880e-fb3d177629d2","Type":"ContainerDied","Data":"5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5"} Apr 22 20:03:31.393527 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.393505 2572 generic.go:358] "Generic (PLEG): container finished" podID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerID="1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe" exitCode=0 Apr 22 20:03:31.393631 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.393533 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" event={"ID":"14d145cc-a86d-4cc6-9c94-78077fa200ad","Type":"ContainerDied","Data":"1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe"} Apr 22 20:03:31.393631 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.393551 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" event={"ID":"14d145cc-a86d-4cc6-9c94-78077fa200ad","Type":"ContainerDied","Data":"c6efa448b60e0e45d738d1d3c0d133b9cb984a2b3e0e7ea831cf84890a5d0204"} Apr 22 20:03:31.393631 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.393564 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7" Apr 22 20:03:31.393759 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.393566 2572 scope.go:117] "RemoveContainer" containerID="1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe" Apr 22 20:03:31.402271 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.402251 2572 scope.go:117] "RemoveContainer" containerID="0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e" Apr 22 20:03:31.410518 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.410503 2572 scope.go:117] "RemoveContainer" containerID="1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe" Apr 22 20:03:31.410831 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:03:31.410806 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe\": container with ID starting with 1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe not found: ID does not exist" containerID="1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe" Apr 22 20:03:31.410933 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.410842 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe"} err="failed to get container status \"1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe\": rpc error: code = NotFound desc = could not find container \"1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe\": container with ID starting with 1dce129306e733bf54b9d21dd7defca87f246018c63b8b5a0680267c63d246fe not found: ID does not exist" Apr 22 20:03:31.410933 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.410867 2572 scope.go:117] "RemoveContainer" containerID="0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e" Apr 22 20:03:31.411163 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:03:31.411141 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e\": container with ID starting with 0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e not found: ID does not exist" containerID="0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e" Apr 22 20:03:31.411228 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.411168 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e"} err="failed to get container status \"0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e\": rpc error: code = NotFound desc = could not find container \"0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e\": container with ID starting with 0acecd27ea802704f4cd531350f3831a42b18105a7dabce4f879a43ac856363e not found: ID does not exist" Apr 22 20:03:31.420138 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.420116 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7"] Apr 22 20:03:31.423903 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:31.423883 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-h5kh7"] Apr 22 20:03:32.169399 ip-10-0-134-231 
kubenswrapper[2572]: I0422 20:03:32.169370 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" path="/var/lib/kubelet/pods/14d145cc-a86d-4cc6-9c94-78077fa200ad/volumes" Apr 22 20:03:32.397821 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:32.397788 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" event={"ID":"0f6851de-9402-476a-880e-fb3d177629d2","Type":"ContainerStarted","Data":"e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150"} Apr 22 20:03:32.398071 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:32.398052 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" Apr 22 20:03:32.415004 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:03:32.414959 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" podStartSLOduration=6.414945348 podStartE2EDuration="6.414945348s" podCreationTimestamp="2026-04-22 20:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:03:32.413522507 +0000 UTC m=+2426.822937022" watchObservedRunningTime="2026-04-22 20:03:32.414945348 +0000 UTC m=+2426.824359862" Apr 22 20:04:03.403560 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:03.403516 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" podUID="0f6851de-9402-476a-880e-fb3d177629d2" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 20:04:13.401654 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:13.401606 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" podUID="0f6851de-9402-476a-880e-fb3d177629d2" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 20:04:23.402546 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:23.402505 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" podUID="0f6851de-9402-476a-880e-fb3d177629d2" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 20:04:33.401773 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:33.401723 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" podUID="0f6851de-9402-476a-880e-fb3d177629d2" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 20:04:42.174786 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:42.171825 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" Apr 22 20:04:46.577119 
ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.577086 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w"] Apr 22 20:04:46.577518 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.577296 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" podUID="0f6851de-9402-476a-880e-fb3d177629d2" containerName="kserve-container" containerID="cri-o://e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150" gracePeriod=30 Apr 22 20:04:46.619225 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.619198 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g"] Apr 22 20:04:46.619481 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.619469 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" Apr 22 20:04:46.619525 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.619483 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" Apr 22 20:04:46.619525 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.619492 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="storage-initializer" Apr 22 20:04:46.619525 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.619498 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="storage-initializer" Apr 22 20:04:46.619617 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.619544 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="14d145cc-a86d-4cc6-9c94-78077fa200ad" containerName="kserve-container" Apr 22 20:04:46.622760 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.622742 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" Apr 22 20:04:46.632494 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.632474 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g"] Apr 22 20:04:46.777631 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.777602 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g\" (UID: \"9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" Apr 22 20:04:46.878168 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.878108 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g\" (UID: \"9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" Apr 22 20:04:46.878477 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.878461 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g\" (UID: \"9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" Apr 22 20:04:46.932213 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:46.932189 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" Apr 22 20:04:47.049752 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:47.049609 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g"] Apr 22 20:04:47.052390 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:04:47.052361 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cb26ffa_2c5f_4f9e_b8da_e9e4a4792c47.slice/crio-1ce230e8f477526045547d69f1097a9cce38eb7986c3d407b1dfc4c91981a952 WatchSource:0}: Error finding container 1ce230e8f477526045547d69f1097a9cce38eb7986c3d407b1dfc4c91981a952: Status 404 returned error can't find the container with id 1ce230e8f477526045547d69f1097a9cce38eb7986c3d407b1dfc4c91981a952 Apr 22 20:04:47.054229 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:47.054214 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:04:47.609129 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:47.609092 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" event={"ID":"9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47","Type":"ContainerStarted","Data":"6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb"} Apr 22 20:04:47.609129 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:47.609132 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" event={"ID":"9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47","Type":"ContainerStarted","Data":"1ce230e8f477526045547d69f1097a9cce38eb7986c3d407b1dfc4c91981a952"} Apr 22 20:04:50.608358 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.608335 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" Apr 22 20:04:50.618875 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.618851 2572 generic.go:358] "Generic (PLEG): container finished" podID="0f6851de-9402-476a-880e-fb3d177629d2" containerID="e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150" exitCode=0 Apr 22 20:04:50.619011 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.618905 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" Apr 22 20:04:50.619011 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.618932 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" event={"ID":"0f6851de-9402-476a-880e-fb3d177629d2","Type":"ContainerDied","Data":"e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150"} Apr 22 20:04:50.619011 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.618976 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w" event={"ID":"0f6851de-9402-476a-880e-fb3d177629d2","Type":"ContainerDied","Data":"92339faaafff23f29487a4310ae16bc183aa226099cddc6eb3a0b6a2d09aa97a"} Apr 22 20:04:50.619011 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.618998 2572 scope.go:117] "RemoveContainer" containerID="e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150" Apr 22 20:04:50.626710 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.626693 2572 scope.go:117] "RemoveContainer" containerID="5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5" Apr 22 20:04:50.633448 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.633430 2572 scope.go:117] "RemoveContainer" containerID="e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150" Apr 22 20:04:50.633715 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:04:50.633697 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150\": container with ID starting with e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150 not found: ID does not exist" containerID="e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150" Apr 22 20:04:50.633811 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.633729 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150"} err="failed to get container status \"e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150\": rpc error: code = NotFound desc = could not find container \"e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150\": container with ID starting with e50ce2e1813620ae9320b3896bc809e7170be0e17b629a3dec80006f51752150 not found: ID does not exist" Apr 22 20:04:50.633811 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.633754 2572 scope.go:117] "RemoveContainer" containerID="5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5" Apr 22 20:04:50.634013 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:04:50.633988 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5\": container with ID starting with 5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5 not found: ID does not exist" containerID="5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5" Apr 22 20:04:50.634070 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.634021 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5"} err="failed to get container status 
\"5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5\": rpc error: code = NotFound desc = could not find container \"5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5\": container with ID starting with 5903b946b63dfaf06010f6243edb2433f36b01bccf0a080edae6f23a4bdb6cf5 not found: ID does not exist" Apr 22 20:04:50.703447 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.703383 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f6851de-9402-476a-880e-fb3d177629d2-kserve-provision-location\") pod \"0f6851de-9402-476a-880e-fb3d177629d2\" (UID: \"0f6851de-9402-476a-880e-fb3d177629d2\") " Apr 22 20:04:50.703729 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.703710 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f6851de-9402-476a-880e-fb3d177629d2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0f6851de-9402-476a-880e-fb3d177629d2" (UID: "0f6851de-9402-476a-880e-fb3d177629d2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:04:50.804848 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.804829 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f6851de-9402-476a-880e-fb3d177629d2-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:04:50.942082 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.942056 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w"] Apr 22 20:04:50.945391 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:50.945371 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9c97w"] Apr 22 20:04:51.623763 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:51.623734 2572 generic.go:358] "Generic (PLEG): container finished" podID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerID="6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb" exitCode=0 Apr 22 20:04:51.624112 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:51.623809 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" event={"ID":"9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47","Type":"ContainerDied","Data":"6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb"} Apr 22 20:04:52.169829 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:52.169804 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6851de-9402-476a-880e-fb3d177629d2" path="/var/lib/kubelet/pods/0f6851de-9402-476a-880e-fb3d177629d2/volumes" Apr 22 20:04:52.627783 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:52.627715 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" event={"ID":"9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47","Type":"ContainerStarted","Data":"05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431"} Apr 22 20:04:52.628106 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:04:52.627940 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" Apr 22 20:04:52.645488 ip-10-0-134-231 
kubenswrapper[2572]: I0422 20:04:52.645448 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" podStartSLOduration=6.645435037 podStartE2EDuration="6.645435037s" podCreationTimestamp="2026-04-22 20:04:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:04:52.643944007 +0000 UTC m=+2507.053358519" watchObservedRunningTime="2026-04-22 20:04:52.645435037 +0000 UTC m=+2507.054849546" Apr 22 20:05:23.632558 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:05:23.632517 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" podUID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 22 20:05:33.631050 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:05:33.631008 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" podUID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 22 20:05:43.631641 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:05:43.631601 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" podUID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 22 20:05:53.631855 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:05:53.631799 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" podUID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 22 20:06:03.635326 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:03.635280 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" Apr 22 20:06:06.736732 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.736698 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g"] Apr 22 20:06:06.737166 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.737035 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" podUID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerName="kserve-container" containerID="cri-o://05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431" gracePeriod=30 Apr 22 20:06:06.780844 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.780820 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df"] Apr 22 20:06:06.781127 ip-10-0-134-231 
kubenswrapper[2572]: I0422 20:06:06.781115 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f6851de-9402-476a-880e-fb3d177629d2" containerName="kserve-container" Apr 22 20:06:06.781179 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.781131 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6851de-9402-476a-880e-fb3d177629d2" containerName="kserve-container" Apr 22 20:06:06.781179 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.781142 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f6851de-9402-476a-880e-fb3d177629d2" containerName="storage-initializer" Apr 22 20:06:06.781179 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.781148 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6851de-9402-476a-880e-fb3d177629d2" containerName="storage-initializer" Apr 22 20:06:06.781272 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.781207 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f6851de-9402-476a-880e-fb3d177629d2" containerName="kserve-container" Apr 22 20:06:06.784357 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.784333 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" Apr 22 20:06:06.792173 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.792152 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df"] Apr 22 20:06:06.817771 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.817748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/400fe68f-c86d-46e8-ab4e-66fa118bc500-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df\" (UID: \"400fe68f-c86d-46e8-ab4e-66fa118bc500\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" Apr 22 20:06:06.918278 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.918252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/400fe68f-c86d-46e8-ab4e-66fa118bc500-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df\" (UID: \"400fe68f-c86d-46e8-ab4e-66fa118bc500\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" Apr 22 20:06:06.918589 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:06.918569 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/400fe68f-c86d-46e8-ab4e-66fa118bc500-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df\" (UID: \"400fe68f-c86d-46e8-ab4e-66fa118bc500\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" Apr 22 20:06:07.093966 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:07.093907 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df"
Apr 22 20:06:07.216463 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:07.216443 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df"]
Apr 22 20:06:07.218526 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:06:07.218504 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400fe68f_c86d_46e8_ab4e_66fa118bc500.slice/crio-62f2da7fa7e0a9320e416c78cafbe2aa999a15bf31b78bed5f11da2bea2de9d9 WatchSource:0}: Error finding container 62f2da7fa7e0a9320e416c78cafbe2aa999a15bf31b78bed5f11da2bea2de9d9: Status 404 returned error can't find the container with id 62f2da7fa7e0a9320e416c78cafbe2aa999a15bf31b78bed5f11da2bea2de9d9
Apr 22 20:06:07.828625 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:07.828588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" event={"ID":"400fe68f-c86d-46e8-ab4e-66fa118bc500","Type":"ContainerStarted","Data":"2d698e34b8442c7a6bcdace191cde3304391a6aeed6367179b262dcc728a528f"}
Apr 22 20:06:07.828625 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:07.828625 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" event={"ID":"400fe68f-c86d-46e8-ab4e-66fa118bc500","Type":"ContainerStarted","Data":"62f2da7fa7e0a9320e416c78cafbe2aa999a15bf31b78bed5f11da2bea2de9d9"}
Apr 22 20:06:10.679126 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.679104 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g"
Apr 22 20:06:10.750031 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.750006 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47-kserve-provision-location\") pod \"9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47\" (UID: \"9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47\") "
Apr 22 20:06:10.750316 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.750297 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" (UID: "9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:06:10.838686 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.838585 2572 generic.go:358] "Generic (PLEG): container finished" podID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerID="05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431" exitCode=0
Apr 22 20:06:10.838686 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.838637 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" event={"ID":"9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47","Type":"ContainerDied","Data":"05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431"}
Apr 22 20:06:10.838889 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.838648 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g"
Apr 22 20:06:10.838889 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.838698 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g" event={"ID":"9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47","Type":"ContainerDied","Data":"1ce230e8f477526045547d69f1097a9cce38eb7986c3d407b1dfc4c91981a952"}
Apr 22 20:06:10.838889 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.838715 2572 scope.go:117] "RemoveContainer" containerID="05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431"
Apr 22 20:06:10.846057 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.846041 2572 scope.go:117] "RemoveContainer" containerID="6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb"
Apr 22 20:06:10.851151 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.851128 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 20:06:10.853415 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.853387 2572 scope.go:117] "RemoveContainer" containerID="05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431"
Apr 22 20:06:10.853652 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:06:10.853634 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431\": container with ID starting with 05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431 not found: ID does not exist" containerID="05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431"
Apr 22 20:06:10.853771 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.853688 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431"} err="failed to get container status \"05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431\": rpc error: code = NotFound desc = could not find container \"05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431\": container with ID starting with 05801f2cc71da473717a40b1e59bdeae107a593e2d093c957776a19acaa37431 not found: ID does not exist"
Apr 22 20:06:10.853771 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.853707 2572 scope.go:117] "RemoveContainer" containerID="6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb"
Apr 22 20:06:10.853959 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:06:10.853943 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb\": container with ID starting with 6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb not found: ID does not exist" containerID="6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb"
Apr 22 20:06:10.853998 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.853966 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb"} err="failed to get container status \"6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb\": rpc error: code = NotFound desc = could not find container \"6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb\": container with ID starting with 6fa5c85d13d82217927d918bb7dff250c1e8500407db9598da70ea05ccb8c5cb not found: ID does not exist"
Apr 22 20:06:10.862958 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.862938 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g"]
Apr 22 20:06:10.868424 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:10.868402 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kq46g"]
Apr 22 20:06:11.842516 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:11.842485 2572 generic.go:358] "Generic (PLEG): container finished" podID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerID="2d698e34b8442c7a6bcdace191cde3304391a6aeed6367179b262dcc728a528f" exitCode=0
Apr 22 20:06:11.842995 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:11.842572 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" event={"ID":"400fe68f-c86d-46e8-ab4e-66fa118bc500","Type":"ContainerDied","Data":"2d698e34b8442c7a6bcdace191cde3304391a6aeed6367179b262dcc728a528f"}
Apr 22 20:06:12.169257 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:12.169226 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" path="/var/lib/kubelet/pods/9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47/volumes"
Apr 22 20:06:12.847678 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:12.847635 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" event={"ID":"400fe68f-c86d-46e8-ab4e-66fa118bc500","Type":"ContainerStarted","Data":"ac60016987bce572a6275a8c3e418f35a50076ee303b33b2877391d120d6eb8d"}
Apr 22 20:06:12.848053 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:12.847858 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df"
Apr 22 20:06:12.864477 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:12.864438 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" podStartSLOduration=6.864425644 podStartE2EDuration="6.864425644s" podCreationTimestamp="2026-04-22 20:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:06:12.863411891 +0000 UTC m=+2587.272826403" watchObservedRunningTime="2026-04-22 20:06:12.864425644 +0000 UTC m=+2587.273840157"
Apr 22 20:06:43.852266 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:43.852179 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" podUID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.43:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 22 20:06:53.851278 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:06:53.851239 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" podUID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.43:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 22 20:07:03.850377 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:03.850333 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" podUID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.43:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 22 20:07:13.850638 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:13.850597 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" podUID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.43:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 22 20:07:23.854498 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:23.854464 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df"
Apr 22 20:07:26.990956 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:26.990927 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df"]
Apr 22 20:07:26.991320 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:26.991151 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" podUID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerName="kserve-container" containerID="cri-o://ac60016987bce572a6275a8c3e418f35a50076ee303b33b2877391d120d6eb8d" gracePeriod=30
Apr 22 20:07:29.056026 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.055996 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"]
Apr 22 20:07:29.056370 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.056272 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerName="kserve-container"
Apr 22 20:07:29.056370 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.056283 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerName="kserve-container"
Apr 22 20:07:29.056370 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.056300 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerName="storage-initializer"
Apr 22 20:07:29.056370 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.056306 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerName="storage-initializer"
Apr 22 20:07:29.056370 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.056352 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cb26ffa-2c5f-4f9e-b8da-e9e4a4792c47" containerName="kserve-container"
Apr 22 20:07:29.059375 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.059354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"
Apr 22 20:07:29.072920 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.072897 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"]
Apr 22 20:07:29.181741 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.181713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a42efb99-2ad5-4218-a8b8-9fb39b74b36b-kserve-provision-location\") pod \"isvc-sklearn-predictor-84d4647756-p86ls\" (UID: \"a42efb99-2ad5-4218-a8b8-9fb39b74b36b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"
Apr 22 20:07:29.282968 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.282945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a42efb99-2ad5-4218-a8b8-9fb39b74b36b-kserve-provision-location\") pod \"isvc-sklearn-predictor-84d4647756-p86ls\" (UID: \"a42efb99-2ad5-4218-a8b8-9fb39b74b36b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"
Apr 22 20:07:29.283273 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.283255 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a42efb99-2ad5-4218-a8b8-9fb39b74b36b-kserve-provision-location\") pod \"isvc-sklearn-predictor-84d4647756-p86ls\" (UID: \"a42efb99-2ad5-4218-a8b8-9fb39b74b36b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"
Apr 22 20:07:29.369156 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.369103 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"
Apr 22 20:07:29.486528 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:29.486399 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"]
Apr 22 20:07:29.489160 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:07:29.489128 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda42efb99_2ad5_4218_a8b8_9fb39b74b36b.slice/crio-352c3e0816ac3b65ce5c693b9b1b17707f08e2064d6c762e29c785292c240876 WatchSource:0}: Error finding container 352c3e0816ac3b65ce5c693b9b1b17707f08e2064d6c762e29c785292c240876: Status 404 returned error can't find the container with id 352c3e0816ac3b65ce5c693b9b1b17707f08e2064d6c762e29c785292c240876
Apr 22 20:07:30.064933 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:30.064897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" event={"ID":"a42efb99-2ad5-4218-a8b8-9fb39b74b36b","Type":"ContainerStarted","Data":"c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0"}
Apr 22 20:07:30.064933 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:30.064935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" event={"ID":"a42efb99-2ad5-4218-a8b8-9fb39b74b36b","Type":"ContainerStarted","Data":"352c3e0816ac3b65ce5c693b9b1b17707f08e2064d6c762e29c785292c240876"}
Apr 22 20:07:31.071294 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:31.071224 2572 generic.go:358] "Generic (PLEG): container finished" podID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerID="ac60016987bce572a6275a8c3e418f35a50076ee303b33b2877391d120d6eb8d" exitCode=0
Apr 22 20:07:31.071605 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:31.071304 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" event={"ID":"400fe68f-c86d-46e8-ab4e-66fa118bc500","Type":"ContainerDied","Data":"ac60016987bce572a6275a8c3e418f35a50076ee303b33b2877391d120d6eb8d"}
Apr 22 20:07:31.422040 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:31.422020 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df"
Apr 22 20:07:31.600147 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:31.600119 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/400fe68f-c86d-46e8-ab4e-66fa118bc500-kserve-provision-location\") pod \"400fe68f-c86d-46e8-ab4e-66fa118bc500\" (UID: \"400fe68f-c86d-46e8-ab4e-66fa118bc500\") "
Apr 22 20:07:31.600456 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:31.600436 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/400fe68f-c86d-46e8-ab4e-66fa118bc500-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "400fe68f-c86d-46e8-ab4e-66fa118bc500" (UID: "400fe68f-c86d-46e8-ab4e-66fa118bc500"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:07:31.701414 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:31.701372 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/400fe68f-c86d-46e8-ab4e-66fa118bc500-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 20:07:32.076036 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:32.075864 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df" event={"ID":"400fe68f-c86d-46e8-ab4e-66fa118bc500","Type":"ContainerDied","Data":"62f2da7fa7e0a9320e416c78cafbe2aa999a15bf31b78bed5f11da2bea2de9d9"}
Apr 22 20:07:32.076036 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:32.075904 2572 scope.go:117] "RemoveContainer" containerID="ac60016987bce572a6275a8c3e418f35a50076ee303b33b2877391d120d6eb8d"
Apr 22 20:07:32.076036 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:32.075941 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df"
Apr 22 20:07:32.084447 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:32.084420 2572 scope.go:117] "RemoveContainer" containerID="2d698e34b8442c7a6bcdace191cde3304391a6aeed6367179b262dcc728a528f"
Apr 22 20:07:32.097498 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:32.097474 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df"]
Apr 22 20:07:32.100795 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:32.100770 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-qq2df"]
Apr 22 20:07:32.170053 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:32.170027 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="400fe68f-c86d-46e8-ab4e-66fa118bc500" path="/var/lib/kubelet/pods/400fe68f-c86d-46e8-ab4e-66fa118bc500/volumes"
Apr 22 20:07:34.083804 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:34.083772 2572 generic.go:358] "Generic (PLEG): container finished" podID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerID="c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0" exitCode=0
Apr 22 20:07:34.084180 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:34.083818 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" event={"ID":"a42efb99-2ad5-4218-a8b8-9fb39b74b36b","Type":"ContainerDied","Data":"c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0"}
Apr 22 20:07:35.088620 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:35.088589 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" event={"ID":"a42efb99-2ad5-4218-a8b8-9fb39b74b36b","Type":"ContainerStarted","Data":"632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26"}
Apr 22 20:07:35.089004 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:35.088883 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"
Apr 22 20:07:35.090139 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:35.090112 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:07:35.104541 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:35.104501 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" podStartSLOduration=6.104488573 podStartE2EDuration="6.104488573s" podCreationTimestamp="2026-04-22 20:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:07:35.103364324 +0000 UTC m=+2669.512778836" watchObservedRunningTime="2026-04-22 20:07:35.104488573 +0000 UTC m=+2669.513903086"
Apr 22 20:07:36.091911 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:36.091873 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:07:46.092447 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:46.092407 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:07:56.092288 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:07:56.092193 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:08:06.092749 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:06.092706 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:08:06.285770 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:06.285747 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log"
Apr 22 20:08:06.302047 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:06.302027 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log"
Apr 22 20:08:16.092575 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:16.092534 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:08:26.092696 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:26.092642 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:08:36.093782 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:36.093749 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"
Apr 22 20:08:39.226013 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.225986 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"]
Apr 22 20:08:39.226459 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.226245 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="kserve-container" containerID="cri-o://632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26" gracePeriod=30
Apr 22 20:08:39.265995 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.265971 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"]
Apr 22 20:08:39.266281 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.266269 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerName="kserve-container"
Apr 22 20:08:39.266323 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.266283 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerName="kserve-container"
Apr 22 20:08:39.266323 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.266296 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerName="storage-initializer"
Apr 22 20:08:39.266323 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.266302 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerName="storage-initializer"
Apr 22 20:08:39.266419 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.266356 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="400fe68f-c86d-46e8-ab4e-66fa118bc500" containerName="kserve-container"
Apr 22 20:08:39.269476 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.269458 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"
Apr 22 20:08:39.278092 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.278065 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"]
Apr 22 20:08:39.368256 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.368235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/046e908d-53d6-458c-b489-bec063a06bed-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-jcwjr\" (UID: \"046e908d-53d6-458c-b489-bec063a06bed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"
Apr 22 20:08:39.468792 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.468765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/046e908d-53d6-458c-b489-bec063a06bed-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-jcwjr\" (UID: \"046e908d-53d6-458c-b489-bec063a06bed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"
Apr 22 20:08:39.469074 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.469060 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/046e908d-53d6-458c-b489-bec063a06bed-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-jcwjr\" (UID: \"046e908d-53d6-458c-b489-bec063a06bed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"
Apr 22 20:08:39.579944 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.579877 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"
Apr 22 20:08:39.697034 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:39.696990 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"]
Apr 22 20:08:39.699740 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:08:39.699714 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046e908d_53d6_458c_b489_bec063a06bed.slice/crio-cbb1cc011ceefc50dceb07eb27d642ea55e84e0517d0366066df85f12ee47fc7 WatchSource:0}: Error finding container cbb1cc011ceefc50dceb07eb27d642ea55e84e0517d0366066df85f12ee47fc7: Status 404 returned error can't find the container with id cbb1cc011ceefc50dceb07eb27d642ea55e84e0517d0366066df85f12ee47fc7
Apr 22 20:08:40.267654 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:40.267618 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr" event={"ID":"046e908d-53d6-458c-b489-bec063a06bed","Type":"ContainerStarted","Data":"288ba9a5b4016b9b6042c85f1af98fcd2de0037aee1acd6b21c839e2e0841dd7"}
Apr 22 20:08:40.267654 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:40.267654 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr" event={"ID":"046e908d-53d6-458c-b489-bec063a06bed","Type":"ContainerStarted","Data":"cbb1cc011ceefc50dceb07eb27d642ea55e84e0517d0366066df85f12ee47fc7"}
Apr 22 20:08:42.958973 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:42.958948 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"
Apr 22 20:08:43.094751 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.094693 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a42efb99-2ad5-4218-a8b8-9fb39b74b36b-kserve-provision-location\") pod \"a42efb99-2ad5-4218-a8b8-9fb39b74b36b\" (UID: \"a42efb99-2ad5-4218-a8b8-9fb39b74b36b\") "
Apr 22 20:08:43.094971 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.094949 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a42efb99-2ad5-4218-a8b8-9fb39b74b36b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a42efb99-2ad5-4218-a8b8-9fb39b74b36b" (UID: "a42efb99-2ad5-4218-a8b8-9fb39b74b36b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:08:43.195116 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.195093 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a42efb99-2ad5-4218-a8b8-9fb39b74b36b-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 20:08:43.277064 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.277035 2572 generic.go:358] "Generic (PLEG): container finished" podID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerID="632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26" exitCode=0
Apr 22 20:08:43.277159 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.277104 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"
Apr 22 20:08:43.277159 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.277120 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" event={"ID":"a42efb99-2ad5-4218-a8b8-9fb39b74b36b","Type":"ContainerDied","Data":"632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26"}
Apr 22 20:08:43.277233 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.277161 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls" event={"ID":"a42efb99-2ad5-4218-a8b8-9fb39b74b36b","Type":"ContainerDied","Data":"352c3e0816ac3b65ce5c693b9b1b17707f08e2064d6c762e29c785292c240876"}
Apr 22 20:08:43.277233 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.277177 2572 scope.go:117] "RemoveContainer" containerID="632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26"
Apr 22 20:08:43.285707 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.285689 2572 scope.go:117] "RemoveContainer" containerID="c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0"
Apr 22 20:08:43.292519 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.292502 2572 scope.go:117] "RemoveContainer" containerID="632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26"
Apr 22 20:08:43.292770 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:08:43.292751 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26\": container with ID starting with 632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26 not found: ID does not exist" containerID="632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26"
Apr 22 20:08:43.292848 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.292782 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26"} err="failed to get container status \"632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26\": rpc error: code = NotFound desc = could not find container \"632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26\": container with ID starting with 632f17df2ce48012d552d9452e6ab983a7d836b9537780ef7d88a15ecdfaef26 not found: ID does not exist"
Apr 22 20:08:43.292848 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.292805 2572 scope.go:117] "RemoveContainer" containerID="c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0"
Apr 22 20:08:43.293008 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:08:43.292992 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0\": container with ID starting with c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0 not found: ID does not exist" containerID="c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0"
Apr 22 20:08:43.293049 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.293014 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0"} err="failed to get container status \"c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0\": rpc error: code = NotFound desc = could not find container \"c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0\": container with ID starting with c7669300e38748fa4efc1529d3c44462ad9fd7706e2fd23864a4cd28df4d66c0 not found: ID does not exist"
Apr 22 20:08:43.297983 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.297961 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"]
Apr 22 20:08:43.302003 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:43.301981 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-84d4647756-p86ls"]
Apr 22 20:08:44.170423 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:44.170390 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" path="/var/lib/kubelet/pods/a42efb99-2ad5-4218-a8b8-9fb39b74b36b/volumes"
Apr 22 20:08:44.280601 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:44.280575 2572 generic.go:358] "Generic (PLEG): container finished" podID="046e908d-53d6-458c-b489-bec063a06bed" containerID="288ba9a5b4016b9b6042c85f1af98fcd2de0037aee1acd6b21c839e2e0841dd7" exitCode=0
Apr 22 20:08:44.280755 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:44.280646 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr" event={"ID":"046e908d-53d6-458c-b489-bec063a06bed","Type":"ContainerDied","Data":"288ba9a5b4016b9b6042c85f1af98fcd2de0037aee1acd6b21c839e2e0841dd7"}
Apr 22 20:08:45.286150 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:45.286118 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr" event={"ID":"046e908d-53d6-458c-b489-bec063a06bed","Type":"ContainerStarted","Data":"00ca0353bb2a388251131e2c57e4deb177598e5fc558984e90c060c87826e126"}
Apr 22 20:08:45.286586 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:45.286342 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"
Apr 22 20:08:45.302578 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:08:45.302526 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr" podStartSLOduration=6.302512683 podStartE2EDuration="6.302512683s" podCreationTimestamp="2026-04-22 20:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:08:45.301064165 +0000 UTC m=+2739.710478690" watchObservedRunningTime="2026-04-22 20:08:45.302512683 +0000 UTC m=+2739.711927196"
Apr 22 20:09:16.333449 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:16.333401 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr" podUID="046e908d-53d6-458c-b489-bec063a06bed" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 22 20:09:26.293983 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:26.293900 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"
Apr 22 20:09:29.365214 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.365184 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"]
Apr 22 20:09:29.365635 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.365427 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr" podUID="046e908d-53d6-458c-b489-bec063a06bed" containerName="kserve-container" containerID="cri-o://00ca0353bb2a388251131e2c57e4deb177598e5fc558984e90c060c87826e126" gracePeriod=30
Apr 22 20:09:29.406810 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.406781 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"]
Apr 22 20:09:29.407074 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.407063 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="storage-initializer"
Apr 22 20:09:29.407144 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.407076 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="storage-initializer"
Apr 22 20:09:29.407144 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.407090 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="kserve-container"
Apr 22 20:09:29.407144 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.407096 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="kserve-container"
Apr 22 20:09:29.407247 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.407148 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a42efb99-2ad5-4218-a8b8-9fb39b74b36b" containerName="kserve-container"
Apr 22 20:09:29.410094 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.410079 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"
Apr 22 20:09:29.420726 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.420707 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"]
Apr 22 20:09:29.513999 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.513974 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a88db47-e69a-4eae-baf3-2de2332e8fd0-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx\" (UID: \"0a88db47-e69a-4eae-baf3-2de2332e8fd0\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"
Apr 22 20:09:29.615031 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.615004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a88db47-e69a-4eae-baf3-2de2332e8fd0-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx\" (UID: \"0a88db47-e69a-4eae-baf3-2de2332e8fd0\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"
Apr 22 20:09:29.615376 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.615336 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a88db47-e69a-4eae-baf3-2de2332e8fd0-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx\" (UID: \"0a88db47-e69a-4eae-baf3-2de2332e8fd0\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"
Apr 22 20:09:29.720083 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.720062 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"
Apr 22 20:09:29.836260 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:29.836236 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"]
Apr 22 20:09:29.839036 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:09:29.839007 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a88db47_e69a_4eae_baf3_2de2332e8fd0.slice/crio-c017dc4739404e48cb2943676617ea3c31ed8019f5062adbeb03c2e9dd681093 WatchSource:0}: Error finding container c017dc4739404e48cb2943676617ea3c31ed8019f5062adbeb03c2e9dd681093: Status 404 returned error can't find the container with id c017dc4739404e48cb2943676617ea3c31ed8019f5062adbeb03c2e9dd681093
Apr 22 20:09:30.407598 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:30.407558 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" event={"ID":"0a88db47-e69a-4eae-baf3-2de2332e8fd0","Type":"ContainerStarted","Data":"6fc5e1799a063f7cd34f248d4e9f0fa792cd99edc810ea88b70ce6b66e16cf2a"}
Apr 22 20:09:30.407598 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:30.407603 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" event={"ID":"0a88db47-e69a-4eae-baf3-2de2332e8fd0","Type":"ContainerStarted","Data":"c017dc4739404e48cb2943676617ea3c31ed8019f5062adbeb03c2e9dd681093"}
Apr 22 20:09:35.424394 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:35.424358 2572 generic.go:358] "Generic (PLEG): container finished" podID="046e908d-53d6-458c-b489-bec063a06bed" containerID="00ca0353bb2a388251131e2c57e4deb177598e5fc558984e90c060c87826e126" exitCode=0
Apr 22 20:09:35.424803 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:35.424425 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr" event={"ID":"046e908d-53d6-458c-b489-bec063a06bed","Type":"ContainerDied","Data":"00ca0353bb2a388251131e2c57e4deb177598e5fc558984e90c060c87826e126"}
Apr 22 20:09:35.425803 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:35.425784 2572 generic.go:358] "Generic (PLEG): container finished" podID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerID="6fc5e1799a063f7cd34f248d4e9f0fa792cd99edc810ea88b70ce6b66e16cf2a" exitCode=0
Apr 22 20:09:35.425894 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:35.425808 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" event={"ID":"0a88db47-e69a-4eae-baf3-2de2332e8fd0","Type":"ContainerDied","Data":"6fc5e1799a063f7cd34f248d4e9f0fa792cd99edc810ea88b70ce6b66e16cf2a"}
Apr 22 20:09:35.506327 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:35.506300 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"
Apr 22 20:09:35.657929 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:35.657887 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/046e908d-53d6-458c-b489-bec063a06bed-kserve-provision-location\") pod \"046e908d-53d6-458c-b489-bec063a06bed\" (UID: \"046e908d-53d6-458c-b489-bec063a06bed\") "
Apr 22 20:09:35.658179 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:35.658158 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046e908d-53d6-458c-b489-bec063a06bed-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "046e908d-53d6-458c-b489-bec063a06bed" (UID: "046e908d-53d6-458c-b489-bec063a06bed"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:09:35.758971 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:35.758935 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/046e908d-53d6-458c-b489-bec063a06bed-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 20:09:36.429923 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:36.429833 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr" event={"ID":"046e908d-53d6-458c-b489-bec063a06bed","Type":"ContainerDied","Data":"cbb1cc011ceefc50dceb07eb27d642ea55e84e0517d0366066df85f12ee47fc7"}
Apr 22 20:09:36.429923 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:36.429868 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"
Apr 22 20:09:36.429923 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:36.429879 2572 scope.go:117] "RemoveContainer" containerID="00ca0353bb2a388251131e2c57e4deb177598e5fc558984e90c060c87826e126"
Apr 22 20:09:36.432014 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:36.431987 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" event={"ID":"0a88db47-e69a-4eae-baf3-2de2332e8fd0","Type":"ContainerStarted","Data":"c87326de73bf8d3c5263b359e8b12e573177f62edfbdbe6b0ad7440b4a30eb02"}
Apr 22 20:09:36.432270 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:36.432253 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"
Apr 22 20:09:36.433898 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:36.433872 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" podUID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused"
Apr 22 20:09:36.439535 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:36.439516 2572 scope.go:117] "RemoveContainer" containerID="288ba9a5b4016b9b6042c85f1af98fcd2de0037aee1acd6b21c839e2e0841dd7"
Apr 22 20:09:36.445576 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:36.445555 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"]
Apr 22 20:09:36.449875 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:36.449854 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jcwjr"]
Apr 22 20:09:36.463650 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:36.463610 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" podStartSLOduration=7.463600485 podStartE2EDuration="7.463600485s" podCreationTimestamp="2026-04-22 20:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:09:36.462348693 +0000 UTC m=+2790.871763217" watchObservedRunningTime="2026-04-22 20:09:36.463600485 +0000 UTC m=+2790.873014998"
Apr 22 20:09:37.436807 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:37.436768 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" podUID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused"
Apr 22 20:09:38.169952 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:38.169920 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="046e908d-53d6-458c-b489-bec063a06bed" path="/var/lib/kubelet/pods/046e908d-53d6-458c-b489-bec063a06bed/volumes"
Apr 22 20:09:47.436768 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:47.436728 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" podUID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused"
Apr 22 20:09:57.438502 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:09:57.438469 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"
Apr 22 20:10:06.367585 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.367559 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx_0a88db47-e69a-4eae-baf3-2de2332e8fd0/kserve-container/0.log"
Apr 22 20:10:06.520335 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.520308 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"]
Apr 22 20:10:06.520635 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.520607 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" podUID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerName="kserve-container" containerID="cri-o://c87326de73bf8d3c5263b359e8b12e573177f62edfbdbe6b0ad7440b4a30eb02" gracePeriod=30
Apr 22 20:10:06.571472 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.571442 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"]
Apr 22 20:10:06.571775 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.571762 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="046e908d-53d6-458c-b489-bec063a06bed" containerName="storage-initializer"
Apr 22 20:10:06.571839 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.571776 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="046e908d-53d6-458c-b489-bec063a06bed" containerName="storage-initializer"
Apr 22 20:10:06.571839 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.571787 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="046e908d-53d6-458c-b489-bec063a06bed" containerName="kserve-container"
Apr 22 20:10:06.571839 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.571793 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="046e908d-53d6-458c-b489-bec063a06bed" containerName="kserve-container"
Apr 22 20:10:06.571839 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.571838 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="046e908d-53d6-458c-b489-bec063a06bed" containerName="kserve-container"
Apr 22 20:10:06.573772 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.573755 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"
Apr 22 20:10:06.584640 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.584617 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"]
Apr 22 20:10:06.686210 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.686189 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/778b70cd-aa27-483d-afa1-51cfe1d1a418-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t\" (UID: \"778b70cd-aa27-483d-afa1-51cfe1d1a418\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"
Apr 22 20:10:06.786691 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.786653 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/778b70cd-aa27-483d-afa1-51cfe1d1a418-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t\" (UID: \"778b70cd-aa27-483d-afa1-51cfe1d1a418\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"
Apr 22 20:10:06.786951 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.786935 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/778b70cd-aa27-483d-afa1-51cfe1d1a418-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t\" (UID: \"778b70cd-aa27-483d-afa1-51cfe1d1a418\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"
Apr 22 20:10:06.884037 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:06.884022 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"
Apr 22 20:10:07.006582 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:07.006482 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"]
Apr 22 20:10:07.009059 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:10:07.009027 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod778b70cd_aa27_483d_afa1_51cfe1d1a418.slice/crio-4c90a5cf12ba339dccb5cd584014ba39ad66d2725263381de939c73fe7471b26 WatchSource:0}: Error finding container 4c90a5cf12ba339dccb5cd584014ba39ad66d2725263381de939c73fe7471b26: Status 404 returned error can't find the container with id 4c90a5cf12ba339dccb5cd584014ba39ad66d2725263381de939c73fe7471b26
Apr 22 20:10:07.010935 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:07.010919 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:10:07.437058 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:07.437026 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" podUID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused"
Apr 22 20:10:07.531816 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:07.531776 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t" event={"ID":"778b70cd-aa27-483d-afa1-51cfe1d1a418","Type":"ContainerStarted","Data":"a4663196e87b741a05b5c24204f7aeab7ed59a1fd43d9346839e187a3a8da67c"}
Apr 22 20:10:07.531816 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:07.531818 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t" event={"ID":"778b70cd-aa27-483d-afa1-51cfe1d1a418","Type":"ContainerStarted","Data":"4c90a5cf12ba339dccb5cd584014ba39ad66d2725263381de939c73fe7471b26"}
Apr 22 20:10:07.533553 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:07.533529 2572 generic.go:358] "Generic (PLEG): container finished" podID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerID="c87326de73bf8d3c5263b359e8b12e573177f62edfbdbe6b0ad7440b4a30eb02" exitCode=0
Apr 22 20:10:07.533698 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:07.533576 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" event={"ID":"0a88db47-e69a-4eae-baf3-2de2332e8fd0","Type":"ContainerDied","Data":"c87326de73bf8d3c5263b359e8b12e573177f62edfbdbe6b0ad7440b4a30eb02"}
Apr 22 20:10:07.545770 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:07.545749 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"
Apr 22 20:10:07.592891 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:07.592860 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a88db47-e69a-4eae-baf3-2de2332e8fd0-kserve-provision-location\") pod \"0a88db47-e69a-4eae-baf3-2de2332e8fd0\" (UID: \"0a88db47-e69a-4eae-baf3-2de2332e8fd0\") "
Apr 22 20:10:07.615947 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:07.615894 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a88db47-e69a-4eae-baf3-2de2332e8fd0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0a88db47-e69a-4eae-baf3-2de2332e8fd0" (UID: "0a88db47-e69a-4eae-baf3-2de2332e8fd0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:10:07.693936 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:07.693898 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a88db47-e69a-4eae-baf3-2de2332e8fd0-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 20:10:08.537086 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:08.537026 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"
Apr 22 20:10:08.537086 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:08.537021 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx" event={"ID":"0a88db47-e69a-4eae-baf3-2de2332e8fd0","Type":"ContainerDied","Data":"c017dc4739404e48cb2943676617ea3c31ed8019f5062adbeb03c2e9dd681093"}
Apr 22 20:10:08.537086 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:08.537084 2572 scope.go:117] "RemoveContainer" containerID="c87326de73bf8d3c5263b359e8b12e573177f62edfbdbe6b0ad7440b4a30eb02"
Apr 22 20:10:08.544628 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:08.544605 2572 scope.go:117] "RemoveContainer" containerID="6fc5e1799a063f7cd34f248d4e9f0fa792cd99edc810ea88b70ce6b66e16cf2a"
Apr 22 20:10:08.553408 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:08.553388 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"]
Apr 22 20:10:08.557491 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:08.557469 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6b6c7d8cdb-g84hx"]
Apr 22 20:10:10.169748 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:10.169718 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" path="/var/lib/kubelet/pods/0a88db47-e69a-4eae-baf3-2de2332e8fd0/volumes"
Apr 22 20:10:11.547572 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:11.547541 2572 generic.go:358] "Generic (PLEG): container finished" podID="778b70cd-aa27-483d-afa1-51cfe1d1a418" containerID="a4663196e87b741a05b5c24204f7aeab7ed59a1fd43d9346839e187a3a8da67c" exitCode=0
Apr 22 20:10:11.547952 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:11.547610 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t" event={"ID":"778b70cd-aa27-483d-afa1-51cfe1d1a418","Type":"ContainerDied","Data":"a4663196e87b741a05b5c24204f7aeab7ed59a1fd43d9346839e187a3a8da67c"}
Apr 22 20:10:12.551996 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:12.551965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t" event={"ID":"778b70cd-aa27-483d-afa1-51cfe1d1a418","Type":"ContainerStarted","Data":"000c45bc75ae606d660b28071d893205ebb945429c3a51d6cd05cba5b5d607e2"}
Apr 22 20:10:12.552396 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:12.552164 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"
Apr 22 20:10:12.568507 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:12.568468 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t" podStartSLOduration=6.568454407 podStartE2EDuration="6.568454407s" podCreationTimestamp="2026-04-22 20:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:10:12.566749192 +0000 UTC m=+2826.976163705" watchObservedRunningTime="2026-04-22 20:10:12.568454407 +0000 UTC m=+2826.977868919"
Apr 22 20:10:43.632761 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:43.632709 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t" podUID="778b70cd-aa27-483d-afa1-51cfe1d1a418" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 22 20:10:53.560421 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:53.560389 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"
Apr 22 20:10:56.665892 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.665811 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"]
Apr 22 20:10:56.666248 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.666129 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t" podUID="778b70cd-aa27-483d-afa1-51cfe1d1a418" containerName="kserve-container" containerID="cri-o://000c45bc75ae606d660b28071d893205ebb945429c3a51d6cd05cba5b5d607e2" gracePeriod=30
Apr 22 20:10:56.701211 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.701176 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f"]
Apr 22 20:10:56.701469 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.701457 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerName="kserve-container"
Apr 22 20:10:56.701534 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.701471 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerName="kserve-container"
Apr 22 20:10:56.701534 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.701490 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerName="storage-initializer"
Apr 22 20:10:56.701534 ip-10-0-134-231 kubenswrapper[2572]: I0422
20:10:56.701496 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerName="storage-initializer" Apr 22 20:10:56.701637 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.701545 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a88db47-e69a-4eae-baf3-2de2332e8fd0" containerName="kserve-container" Apr 22 20:10:56.703272 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.703257 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" Apr 22 20:10:56.711405 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.711384 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f"] Apr 22 20:10:56.821852 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.821827 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5603b95a-38c3-45e9-939f-0fc181f832eb-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-c9cd7bf64-45d5f\" (UID: \"5603b95a-38c3-45e9-939f-0fc181f832eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" Apr 22 20:10:56.923137 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.923082 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5603b95a-38c3-45e9-939f-0fc181f832eb-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-c9cd7bf64-45d5f\" (UID: \"5603b95a-38c3-45e9-939f-0fc181f832eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" Apr 22 20:10:56.923376 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:56.923361 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5603b95a-38c3-45e9-939f-0fc181f832eb-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-c9cd7bf64-45d5f\" (UID: \"5603b95a-38c3-45e9-939f-0fc181f832eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" Apr 22 20:10:57.012833 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:57.012807 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" Apr 22 20:10:57.127495 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:57.127474 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f"] Apr 22 20:10:57.129829 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:10:57.129800 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5603b95a_38c3_45e9_939f_0fc181f832eb.slice/crio-3efdd0c8965523d445f1d1a1e7b4f52495980f1c6499d79427e7db88a858f863 WatchSource:0}: Error finding container 3efdd0c8965523d445f1d1a1e7b4f52495980f1c6499d79427e7db88a858f863: Status 404 returned error can't find the container with id 3efdd0c8965523d445f1d1a1e7b4f52495980f1c6499d79427e7db88a858f863 Apr 22 20:10:57.686357 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:57.686314 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" event={"ID":"5603b95a-38c3-45e9-939f-0fc181f832eb","Type":"ContainerStarted","Data":"b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926"} Apr 22 20:10:57.686840 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:10:57.686365 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" event={"ID":"5603b95a-38c3-45e9-939f-0fc181f832eb","Type":"ContainerStarted","Data":"3efdd0c8965523d445f1d1a1e7b4f52495980f1c6499d79427e7db88a858f863"} Apr 22 20:11:01.699722 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:01.699687 2572 generic.go:358] "Generic (PLEG): container finished" podID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerID="b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926" exitCode=0 Apr 22 20:11:01.700140 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:01.699759 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" event={"ID":"5603b95a-38c3-45e9-939f-0fc181f832eb","Type":"ContainerDied","Data":"b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926"} Apr 22 20:11:02.705974 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:02.705947 2572 generic.go:358] "Generic (PLEG): container finished" podID="778b70cd-aa27-483d-afa1-51cfe1d1a418" containerID="000c45bc75ae606d660b28071d893205ebb945429c3a51d6cd05cba5b5d607e2" exitCode=0 Apr 22 20:11:02.706362 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:02.706011 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t" event={"ID":"778b70cd-aa27-483d-afa1-51cfe1d1a418","Type":"ContainerDied","Data":"000c45bc75ae606d660b28071d893205ebb945429c3a51d6cd05cba5b5d607e2"} Apr 22 20:11:02.707793 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:02.707770 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" event={"ID":"5603b95a-38c3-45e9-939f-0fc181f832eb","Type":"ContainerStarted","Data":"02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c"} Apr 22 20:11:02.708084 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:02.708064 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" Apr 22 20:11:02.709615 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:02.709495 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 22 20:11:02.727645 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:02.727609 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" podStartSLOduration=6.727596845 podStartE2EDuration="6.727596845s" podCreationTimestamp="2026-04-22 20:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:11:02.724616962 +0000 UTC m=+2877.134031476" watchObservedRunningTime="2026-04-22 20:11:02.727596845 +0000 UTC m=+2877.137011358" Apr 22 20:11:02.804330 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:02.804311 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t" Apr 22 20:11:02.964909 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:02.964886 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/778b70cd-aa27-483d-afa1-51cfe1d1a418-kserve-provision-location\") pod \"778b70cd-aa27-483d-afa1-51cfe1d1a418\" (UID: \"778b70cd-aa27-483d-afa1-51cfe1d1a418\") " Apr 22 20:11:02.965180 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:02.965159 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/778b70cd-aa27-483d-afa1-51cfe1d1a418-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "778b70cd-aa27-483d-afa1-51cfe1d1a418" (UID: "778b70cd-aa27-483d-afa1-51cfe1d1a418"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:11:03.066132 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:03.066109 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/778b70cd-aa27-483d-afa1-51cfe1d1a418-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:11:03.714353 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:03.714323 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t" Apr 22 20:11:03.714822 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:03.714345 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t" event={"ID":"778b70cd-aa27-483d-afa1-51cfe1d1a418","Type":"ContainerDied","Data":"4c90a5cf12ba339dccb5cd584014ba39ad66d2725263381de939c73fe7471b26"} Apr 22 20:11:03.714822 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:03.714398 2572 scope.go:117] "RemoveContainer" containerID="000c45bc75ae606d660b28071d893205ebb945429c3a51d6cd05cba5b5d607e2" Apr 22 20:11:03.714822 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:03.714717 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 22 20:11:03.722185 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:03.722044 2572 scope.go:117] "RemoveContainer" containerID="a4663196e87b741a05b5c24204f7aeab7ed59a1fd43d9346839e187a3a8da67c" Apr 22 20:11:03.741132 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:03.741109 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"] Apr 22 20:11:03.744125 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:03.744105 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-nrm6t"] Apr 22 20:11:04.169344 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:04.169313 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="778b70cd-aa27-483d-afa1-51cfe1d1a418" path="/var/lib/kubelet/pods/778b70cd-aa27-483d-afa1-51cfe1d1a418/volumes" Apr 22 20:11:13.715130 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:13.715092 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 22 20:11:23.714685 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:23.714630 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 22 20:11:33.714856 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:33.714816 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 22 20:11:43.715642 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:43.715602 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 22 20:11:53.715109 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:11:53.715066 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 22 20:12:03.715747 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:03.715708 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" Apr 22 20:12:06.991979 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:06.991949 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f"] Apr 22 20:12:06.992478 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:06.992177 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="kserve-container" containerID="cri-o://02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c" gracePeriod=30 Apr 22 20:12:07.024932 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.024902 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n"] Apr 22 20:12:07.025183 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.025171 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="778b70cd-aa27-483d-afa1-51cfe1d1a418" containerName="kserve-container" Apr 22 20:12:07.025229 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.025184 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="778b70cd-aa27-483d-afa1-51cfe1d1a418" containerName="kserve-container" Apr 22 20:12:07.025229 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.025195 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="778b70cd-aa27-483d-afa1-51cfe1d1a418" containerName="storage-initializer" Apr 22 20:12:07.025229 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.025200 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="778b70cd-aa27-483d-afa1-51cfe1d1a418" containerName="storage-initializer" Apr 22 20:12:07.025321 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.025252 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="778b70cd-aa27-483d-afa1-51cfe1d1a418" containerName="kserve-container" Apr 22 20:12:07.028127 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.028111 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" Apr 22 20:12:07.036115 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.036091 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n"] Apr 22 20:12:07.113274 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.113253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0251522-8b9e-47b8-aa07-1ff8fb56b267-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n\" (UID: \"d0251522-8b9e-47b8-aa07-1ff8fb56b267\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" Apr 22 20:12:07.214109 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.214082 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0251522-8b9e-47b8-aa07-1ff8fb56b267-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n\" (UID: \"d0251522-8b9e-47b8-aa07-1ff8fb56b267\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" Apr 22 20:12:07.214402 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.214386 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0251522-8b9e-47b8-aa07-1ff8fb56b267-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n\" (UID: \"d0251522-8b9e-47b8-aa07-1ff8fb56b267\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" Apr 22 20:12:07.338424 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.338351 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" Apr 22 20:12:07.451369 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.451332 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n"] Apr 22 20:12:07.455144 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:12:07.455118 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0251522_8b9e_47b8_aa07_1ff8fb56b267.slice/crio-7966388a2d961f0da5a490d640eec3564dbcd1575ed028f34405ded9eb4ce21e WatchSource:0}: Error finding container 7966388a2d961f0da5a490d640eec3564dbcd1575ed028f34405ded9eb4ce21e: Status 404 returned error can't find the container with id 7966388a2d961f0da5a490d640eec3564dbcd1575ed028f34405ded9eb4ce21e Apr 22 20:12:07.896368 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.896333 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" event={"ID":"d0251522-8b9e-47b8-aa07-1ff8fb56b267","Type":"ContainerStarted","Data":"dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293"} Apr 22 20:12:07.896527 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:07.896378 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" event={"ID":"d0251522-8b9e-47b8-aa07-1ff8fb56b267","Type":"ContainerStarted","Data":"7966388a2d961f0da5a490d640eec3564dbcd1575ed028f34405ded9eb4ce21e"} Apr 22 20:12:10.727931 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.727909 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" Apr 22 20:12:10.842759 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.842693 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5603b95a-38c3-45e9-939f-0fc181f832eb-kserve-provision-location\") pod \"5603b95a-38c3-45e9-939f-0fc181f832eb\" (UID: \"5603b95a-38c3-45e9-939f-0fc181f832eb\") " Apr 22 20:12:10.842962 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.842941 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5603b95a-38c3-45e9-939f-0fc181f832eb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5603b95a-38c3-45e9-939f-0fc181f832eb" (UID: "5603b95a-38c3-45e9-939f-0fc181f832eb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:12:10.906154 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.906127 2572 generic.go:358] "Generic (PLEG): container finished" podID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerID="02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c" exitCode=0 Apr 22 20:12:10.906254 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.906191 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" Apr 22 20:12:10.906318 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.906192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" event={"ID":"5603b95a-38c3-45e9-939f-0fc181f832eb","Type":"ContainerDied","Data":"02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c"} Apr 22 20:12:10.906318 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.906297 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f" event={"ID":"5603b95a-38c3-45e9-939f-0fc181f832eb","Type":"ContainerDied","Data":"3efdd0c8965523d445f1d1a1e7b4f52495980f1c6499d79427e7db88a858f863"} Apr 22 20:12:10.906420 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.906322 2572 scope.go:117] "RemoveContainer" containerID="02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c" Apr 22 20:12:10.916032 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.915909 2572 scope.go:117] "RemoveContainer" containerID="b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926" Apr 22 20:12:10.923303 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.923284 2572 scope.go:117] "RemoveContainer" containerID="02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c" Apr 22 20:12:10.923554 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:12:10.923536 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c\": container with ID starting with 02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c not found: ID does not exist" containerID="02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c" Apr 22 20:12:10.923597 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.923564 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c"} err="failed to get container status \"02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c\": rpc error: code = NotFound desc = could not find container \"02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c\": container with ID starting with 02fa4b3693fc60e863c295a7868a4e7c4a84d025c646527c6758bccbbaf3511c not found: ID does not exist" Apr 22 20:12:10.923597 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.923581 2572 scope.go:117] "RemoveContainer" containerID="b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926" Apr 22 20:12:10.923835 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:12:10.923817 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926\": container with ID starting with b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926 not found: ID does not exist" containerID="b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926" Apr 22 20:12:10.923894 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.923841 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926"} err="failed to get container status \"b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926\": rpc error: code = NotFound 
desc = could not find container \"b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926\": container with ID starting with b972c2337d8420f3a4a1c93989bdb4d305ea59cba1745262bd5535199eeaa926 not found: ID does not exist" Apr 22 20:12:10.929260 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.929237 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f"] Apr 22 20:12:10.934484 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.934466 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-c9cd7bf64-45d5f"] Apr 22 20:12:10.943819 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:10.943801 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5603b95a-38c3-45e9-939f-0fc181f832eb-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:12:11.910912 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:11.910880 2572 generic.go:358] "Generic (PLEG): container finished" podID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerID="dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293" exitCode=0 Apr 22 20:12:11.911276 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:11.910955 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" event={"ID":"d0251522-8b9e-47b8-aa07-1ff8fb56b267","Type":"ContainerDied","Data":"dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293"} Apr 22 20:12:12.170138 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:12.170060 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" path="/var/lib/kubelet/pods/5603b95a-38c3-45e9-939f-0fc181f832eb/volumes" Apr 22 20:12:12.914931 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:12.914899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" event={"ID":"d0251522-8b9e-47b8-aa07-1ff8fb56b267","Type":"ContainerStarted","Data":"c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe"} Apr 22 20:12:12.915396 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:12.915217 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" Apr 22 20:12:12.916308 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:12.916278 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 22 20:12:12.932875 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:12.932828 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" podStartSLOduration=5.932814589 podStartE2EDuration="5.932814589s" podCreationTimestamp="2026-04-22 20:12:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:12:12.93050204 +0000 UTC m=+2947.339916553" watchObservedRunningTime="2026-04-22 20:12:12.932814589 +0000 UTC m=+2947.342229165" Apr 22 20:12:13.918271 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:13.918234 2572 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 22 20:12:23.919209 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:23.919167 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 22 20:12:33.919031 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:33.918941 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 22 20:12:43.918554 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:43.918514 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 22 20:12:53.918899 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:12:53.918857 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 22 20:13:03.918817 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:03.918768 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 22 20:13:06.305903 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:06.305875 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 20:13:06.323526 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:06.323500 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 20:13:13.919703 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:13.919647 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" Apr 22 20:13:17.176413 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.176382 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n"] Apr 22 20:13:17.176906 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.176615 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="kserve-container" containerID="cri-o://c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe" gracePeriod=30 Apr 
22 20:13:17.241078 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.241048 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p"] Apr 22 20:13:17.241352 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.241340 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="storage-initializer" Apr 22 20:13:17.241403 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.241354 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="storage-initializer" Apr 22 20:13:17.241403 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.241377 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="kserve-container" Apr 22 20:13:17.241403 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.241391 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="kserve-container" Apr 22 20:13:17.241502 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.241446 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5603b95a-38c3-45e9-939f-0fc181f832eb" containerName="kserve-container" Apr 22 20:13:17.244398 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.244373 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" Apr 22 20:13:17.254860 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.254840 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p"] Apr 22 20:13:17.388159 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.388118 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cab6bd68-5878-47db-b45a-1e84ee2fe52e-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-ln25p\" (UID: \"cab6bd68-5878-47db-b45a-1e84ee2fe52e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" Apr 22 20:13:17.489507 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.489441 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cab6bd68-5878-47db-b45a-1e84ee2fe52e-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-ln25p\" (UID: \"cab6bd68-5878-47db-b45a-1e84ee2fe52e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" Apr 22 20:13:17.489823 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.489803 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cab6bd68-5878-47db-b45a-1e84ee2fe52e-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-ln25p\" (UID: \"cab6bd68-5878-47db-b45a-1e84ee2fe52e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" Apr 22 20:13:17.554722 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.554698 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" Apr 22 20:13:17.672288 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:17.672264 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p"] Apr 22 20:13:17.674596 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:13:17.674566 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcab6bd68_5878_47db_b45a_1e84ee2fe52e.slice/crio-4bf806086a84bf63133e536948b821789abe056a9d11ac309abbd5a248416dda WatchSource:0}: Error finding container 4bf806086a84bf63133e536948b821789abe056a9d11ac309abbd5a248416dda: Status 404 returned error can't find the container with id 4bf806086a84bf63133e536948b821789abe056a9d11ac309abbd5a248416dda Apr 22 20:13:18.094293 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:18.094259 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" event={"ID":"cab6bd68-5878-47db-b45a-1e84ee2fe52e","Type":"ContainerStarted","Data":"6972298233f4d0a6e5672d6cef4e17eaadb8c9ba2a3b4cd077558d773e3667f5"} Apr 22 20:13:18.094293 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:18.094293 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" event={"ID":"cab6bd68-5878-47db-b45a-1e84ee2fe52e","Type":"ContainerStarted","Data":"4bf806086a84bf63133e536948b821789abe056a9d11ac309abbd5a248416dda"} Apr 22 20:13:21.007310 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.007284 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" Apr 22 20:13:21.104555 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.104491 2572 generic.go:358] "Generic (PLEG): container finished" podID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerID="c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe" exitCode=0 Apr 22 20:13:21.104707 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.104555 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" Apr 22 20:13:21.104707 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.104581 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" event={"ID":"d0251522-8b9e-47b8-aa07-1ff8fb56b267","Type":"ContainerDied","Data":"c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe"} Apr 22 20:13:21.104707 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.104622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n" event={"ID":"d0251522-8b9e-47b8-aa07-1ff8fb56b267","Type":"ContainerDied","Data":"7966388a2d961f0da5a490d640eec3564dbcd1575ed028f34405ded9eb4ce21e"} Apr 22 20:13:21.104707 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.104643 2572 scope.go:117] "RemoveContainer" containerID="c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe" Apr 22 20:13:21.112433 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.112416 2572 scope.go:117] "RemoveContainer" containerID="dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293" Apr 22 20:13:21.114464 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.114440 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0251522-8b9e-47b8-aa07-1ff8fb56b267-kserve-provision-location\") pod \"d0251522-8b9e-47b8-aa07-1ff8fb56b267\" (UID: \"d0251522-8b9e-47b8-aa07-1ff8fb56b267\") " Apr 22 20:13:21.114819 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.114790 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0251522-8b9e-47b8-aa07-1ff8fb56b267-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d0251522-8b9e-47b8-aa07-1ff8fb56b267" (UID: "d0251522-8b9e-47b8-aa07-1ff8fb56b267"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:13:21.121059 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.121038 2572 scope.go:117] "RemoveContainer" containerID="c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe" Apr 22 20:13:21.121417 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:13:21.121393 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe\": container with ID starting with c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe not found: ID does not exist" containerID="c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe" Apr 22 20:13:21.121488 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.121427 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe"} err="failed to get container status \"c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe\": rpc error: code = NotFound desc = could not find container \"c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe\": container with ID starting with c61a7b7f9889c8bd2a63081d9ee874008a44f70688bc82cdcda83d17be203abe not found: ID does not exist" Apr 22 20:13:21.121488 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.121451 2572 scope.go:117] "RemoveContainer" containerID="dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293" Apr 22 20:13:21.121905 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:13:21.121881 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293\": container with ID starting with dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293 not found: ID does not exist" containerID="dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293" Apr 22 20:13:21.121987 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.121912 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293"} err="failed to get container status \"dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293\": rpc error: code = NotFound desc = could not find container \"dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293\": container with ID starting with dceb3b9155013bad46917858c5cfc0e23554be0265f0551ffb1364907dbfe293 not found: ID does not exist" Apr 22 20:13:21.215441 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.215419 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0251522-8b9e-47b8-aa07-1ff8fb56b267-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:13:21.427334 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.427313 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n"] Apr 22 20:13:21.430695 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:21.430656 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-f9fd7d99-qxn8n"] Apr 22 20:13:22.109144 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:22.109117 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="cab6bd68-5878-47db-b45a-1e84ee2fe52e" containerID="6972298233f4d0a6e5672d6cef4e17eaadb8c9ba2a3b4cd077558d773e3667f5" exitCode=0 Apr 22 20:13:22.109467 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:22.109182 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" event={"ID":"cab6bd68-5878-47db-b45a-1e84ee2fe52e","Type":"ContainerDied","Data":"6972298233f4d0a6e5672d6cef4e17eaadb8c9ba2a3b4cd077558d773e3667f5"} Apr 22 20:13:22.169709 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:22.169685 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" path="/var/lib/kubelet/pods/d0251522-8b9e-47b8-aa07-1ff8fb56b267/volumes" Apr 22 20:13:26.127508 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:26.127484 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" event={"ID":"cab6bd68-5878-47db-b45a-1e84ee2fe52e","Type":"ContainerStarted","Data":"ee4cff1b31da604548f197d878286e5f679c74b513ae2d2c7bc280bb590ba65b"} Apr 22 20:13:26.127864 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:26.127846 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" Apr 22 20:13:26.128871 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:26.128846 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" podUID="cab6bd68-5878-47db-b45a-1e84ee2fe52e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 22 20:13:26.145571 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:26.145532 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" podStartSLOduration=5.288951923 podStartE2EDuration="9.14551981s" podCreationTimestamp="2026-04-22 20:13:17 +0000 UTC" firstStartedPulling="2026-04-22 20:13:22.1102684 +0000 UTC m=+3016.519682891" lastFinishedPulling="2026-04-22 20:13:25.966836287 +0000 UTC m=+3020.376250778" observedRunningTime="2026-04-22 20:13:26.144520384 +0000 UTC m=+3020.553934911" watchObservedRunningTime="2026-04-22 20:13:26.14551981 +0000 UTC m=+3020.554934322" Apr 22 20:13:27.130323 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:27.130284 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" podUID="cab6bd68-5878-47db-b45a-1e84ee2fe52e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 22 20:13:37.131590 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:37.131551 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" Apr 22 20:13:58.061553 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.061463 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p"] Apr 22 20:13:58.062095 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.061833 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" podUID="cab6bd68-5878-47db-b45a-1e84ee2fe52e" containerName="kserve-container" 
containerID="cri-o://ee4cff1b31da604548f197d878286e5f679c74b513ae2d2c7bc280bb590ba65b" gracePeriod=30 Apr 22 20:13:58.128202 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.128171 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6"] Apr 22 20:13:58.128451 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.128440 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="storage-initializer" Apr 22 20:13:58.128496 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.128452 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="storage-initializer" Apr 22 20:13:58.128496 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.128463 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="kserve-container" Apr 22 20:13:58.128496 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.128469 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="kserve-container" Apr 22 20:13:58.128592 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.128523 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0251522-8b9e-47b8-aa07-1ff8fb56b267" containerName="kserve-container" Apr 22 20:13:58.130366 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.130350 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" Apr 22 20:13:58.139262 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.139235 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6"] Apr 22 20:13:58.186642 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.186614 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/241f9e65-8a4a-454a-a44e-c191c97aa38e-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6\" (UID: \"241f9e65-8a4a-454a-a44e-c191c97aa38e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" Apr 22 20:13:58.287728 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.287705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/241f9e65-8a4a-454a-a44e-c191c97aa38e-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6\" (UID: \"241f9e65-8a4a-454a-a44e-c191c97aa38e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" Apr 22 20:13:58.288036 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.288018 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/241f9e65-8a4a-454a-a44e-c191c97aa38e-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6\" (UID: \"241f9e65-8a4a-454a-a44e-c191c97aa38e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" Apr 22 20:13:58.441593 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.441572 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" Apr 22 20:13:58.556100 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:58.556063 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6"] Apr 22 20:13:58.558853 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:13:58.558814 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod241f9e65_8a4a_454a_a44e_c191c97aa38e.slice/crio-8a1c16af6867869c6d9fa3a27c701f1a74076c3853bd718564bcc15703c43a8c WatchSource:0}: Error finding container 8a1c16af6867869c6d9fa3a27c701f1a74076c3853bd718564bcc15703c43a8c: Status 404 returned error can't find the container with id 8a1c16af6867869c6d9fa3a27c701f1a74076c3853bd718564bcc15703c43a8c Apr 22 20:13:59.223879 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:59.223839 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" event={"ID":"241f9e65-8a4a-454a-a44e-c191c97aa38e","Type":"ContainerStarted","Data":"9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469"} Apr 22 20:13:59.223879 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:13:59.223880 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" event={"ID":"241f9e65-8a4a-454a-a44e-c191c97aa38e","Type":"ContainerStarted","Data":"8a1c16af6867869c6d9fa3a27c701f1a74076c3853bd718564bcc15703c43a8c"} Apr 22 20:14:03.234646 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:03.234618 2572 generic.go:358] "Generic (PLEG): container finished" podID="241f9e65-8a4a-454a-a44e-c191c97aa38e" containerID="9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469" exitCode=0 Apr 22 20:14:03.234947 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:03.234694 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" event={"ID":"241f9e65-8a4a-454a-a44e-c191c97aa38e","Type":"ContainerDied","Data":"9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469"} Apr 22 20:14:04.239491 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:04.239457 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" event={"ID":"241f9e65-8a4a-454a-a44e-c191c97aa38e","Type":"ContainerStarted","Data":"b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba"} Apr 22 20:14:04.239886 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:04.239759 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" Apr 22 20:14:04.241151 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:04.241121 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" podUID="241f9e65-8a4a-454a-a44e-c191c97aa38e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 22 20:14:04.255170 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:04.255131 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" podStartSLOduration=6.255120377 podStartE2EDuration="6.255120377s" podCreationTimestamp="2026-04-22 
20:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:14:04.253901044 +0000 UTC m=+3058.663315557" watchObservedRunningTime="2026-04-22 20:14:04.255120377 +0000 UTC m=+3058.664534890" Apr 22 20:14:05.243078 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:05.243039 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" podUID="241f9e65-8a4a-454a-a44e-c191c97aa38e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 22 20:14:15.244589 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:15.244559 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" Apr 22 20:14:27.525194 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:27.525162 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6"] Apr 22 20:14:27.525605 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:27.525424 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" podUID="241f9e65-8a4a-454a-a44e-c191c97aa38e" containerName="kserve-container" containerID="cri-o://b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba" gracePeriod=30 Apr 22 20:14:27.567303 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:27.567268 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx"] Apr 22 20:14:27.574959 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:27.574935 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" Apr 22 20:14:27.578202 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:27.578175 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx"] Apr 22 20:14:27.699047 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:27.699022 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c143d5c2-9b31-49d3-8aaa-df257d9937d4-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-t7lsx\" (UID: \"c143d5c2-9b31-49d3-8aaa-df257d9937d4\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" Apr 22 20:14:27.799934 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:27.799870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c143d5c2-9b31-49d3-8aaa-df257d9937d4-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-t7lsx\" (UID: \"c143d5c2-9b31-49d3-8aaa-df257d9937d4\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" Apr 22 20:14:27.800178 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:27.800163 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c143d5c2-9b31-49d3-8aaa-df257d9937d4-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-t7lsx\" (UID: \"c143d5c2-9b31-49d3-8aaa-df257d9937d4\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" Apr 22 20:14:27.885705 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:27.885682 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" Apr 22 20:14:28.001988 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:28.001961 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx"] Apr 22 20:14:28.308138 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:28.308064 2572 generic.go:358] "Generic (PLEG): container finished" podID="cab6bd68-5878-47db-b45a-1e84ee2fe52e" containerID="ee4cff1b31da604548f197d878286e5f679c74b513ae2d2c7bc280bb590ba65b" exitCode=137 Apr 22 20:14:28.308278 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:28.308142 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" event={"ID":"cab6bd68-5878-47db-b45a-1e84ee2fe52e","Type":"ContainerDied","Data":"ee4cff1b31da604548f197d878286e5f679c74b513ae2d2c7bc280bb590ba65b"} Apr 22 20:14:28.309445 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:28.309422 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" event={"ID":"c143d5c2-9b31-49d3-8aaa-df257d9937d4","Type":"ContainerStarted","Data":"b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d"} Apr 22 20:14:28.309560 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:28.309450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" event={"ID":"c143d5c2-9b31-49d3-8aaa-df257d9937d4","Type":"ContainerStarted","Data":"5dbe9c6b1d5c97ba07ddf8b8f891d9509a708291ac7e8410699eba5ecb97b920"} Apr 22 20:14:28.684525 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:28.684505 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" Apr 22 20:14:28.806443 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:28.806417 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cab6bd68-5878-47db-b45a-1e84ee2fe52e-kserve-provision-location\") pod \"cab6bd68-5878-47db-b45a-1e84ee2fe52e\" (UID: \"cab6bd68-5878-47db-b45a-1e84ee2fe52e\") " Apr 22 20:14:28.815010 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:28.814981 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cab6bd68-5878-47db-b45a-1e84ee2fe52e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cab6bd68-5878-47db-b45a-1e84ee2fe52e" (UID: "cab6bd68-5878-47db-b45a-1e84ee2fe52e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:14:28.907352 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:28.907329 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cab6bd68-5878-47db-b45a-1e84ee2fe52e-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:14:29.313237 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:29.313206 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" Apr 22 20:14:29.313481 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:29.313210 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p" event={"ID":"cab6bd68-5878-47db-b45a-1e84ee2fe52e","Type":"ContainerDied","Data":"4bf806086a84bf63133e536948b821789abe056a9d11ac309abbd5a248416dda"} Apr 22 20:14:29.313481 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:29.313332 2572 scope.go:117] "RemoveContainer" containerID="ee4cff1b31da604548f197d878286e5f679c74b513ae2d2c7bc280bb590ba65b" Apr 22 20:14:29.321087 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:29.320921 2572 scope.go:117] "RemoveContainer" containerID="6972298233f4d0a6e5672d6cef4e17eaadb8c9ba2a3b4cd077558d773e3667f5" Apr 22 20:14:29.333942 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:29.333918 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p"] Apr 22 20:14:29.337561 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:29.337541 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ln25p"] Apr 22 20:14:30.169289 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:30.169255 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cab6bd68-5878-47db-b45a-1e84ee2fe52e" path="/var/lib/kubelet/pods/cab6bd68-5878-47db-b45a-1e84ee2fe52e/volumes" Apr 22 20:14:32.324619 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:32.324542 2572 generic.go:358] "Generic (PLEG): container finished" podID="c143d5c2-9b31-49d3-8aaa-df257d9937d4" containerID="b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d" exitCode=0 Apr 22 20:14:32.324944 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:32.324622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" event={"ID":"c143d5c2-9b31-49d3-8aaa-df257d9937d4","Type":"ContainerDied","Data":"b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d"} Apr 22 20:14:58.200965 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.200946 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" Apr 22 20:14:58.256160 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.256127 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/241f9e65-8a4a-454a-a44e-c191c97aa38e-kserve-provision-location\") pod \"241f9e65-8a4a-454a-a44e-c191c97aa38e\" (UID: \"241f9e65-8a4a-454a-a44e-c191c97aa38e\") " Apr 22 20:14:58.260011 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.259985 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/241f9e65-8a4a-454a-a44e-c191c97aa38e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "241f9e65-8a4a-454a-a44e-c191c97aa38e" (UID: "241f9e65-8a4a-454a-a44e-c191c97aa38e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:14:58.357195 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.357118 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/241f9e65-8a4a-454a-a44e-c191c97aa38e-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:14:58.439864 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.439826 2572 generic.go:358] "Generic (PLEG): container finished" podID="241f9e65-8a4a-454a-a44e-c191c97aa38e" containerID="b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba" exitCode=137 Apr 22 20:14:58.440029 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.439897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" event={"ID":"241f9e65-8a4a-454a-a44e-c191c97aa38e","Type":"ContainerDied","Data":"b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba"} Apr 22 20:14:58.440029 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.439905 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" Apr 22 20:14:58.440029 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.439924 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6" event={"ID":"241f9e65-8a4a-454a-a44e-c191c97aa38e","Type":"ContainerDied","Data":"8a1c16af6867869c6d9fa3a27c701f1a74076c3853bd718564bcc15703c43a8c"} Apr 22 20:14:58.440029 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.439971 2572 scope.go:117] "RemoveContainer" containerID="b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba" Apr 22 20:14:58.452777 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.452753 2572 scope.go:117] "RemoveContainer" containerID="9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469" Apr 22 20:14:58.462878 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.462766 2572 scope.go:117] "RemoveContainer" containerID="b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba" Apr 22 20:14:58.463577 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:14:58.463430 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba\": container with ID starting with b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba not found: ID does not exist" containerID="b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba" Apr 22 20:14:58.463577 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.463471 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba"} err="failed to get container status \"b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba\": rpc error: code = NotFound desc = could not find container \"b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba\": container with ID starting with b6483b5879ee2d06cd4ca7d03de4e19e93806affd1434f5ccc3e6564fac111ba not found: ID does not exist" Apr 22 20:14:58.463577 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.463496 2572 scope.go:117] "RemoveContainer" containerID="9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469" Apr 22 20:14:58.464194 
Apr 22 20:14:58.464194 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:14:58.464165 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469\": container with ID starting with 9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469 not found: ID does not exist" containerID="9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469"
Apr 22 20:14:58.464299 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.464203 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469"} err="failed to get container status \"9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469\": rpc error: code = NotFound desc = could not find container \"9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469\": container with ID starting with 9b1c7133fcb087089f4e59bedc818eb9d45f2b7b3c9f19653bd0d2070057c469 not found: ID does not exist"
Apr 22 20:14:58.465553 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.465531 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6"]
Apr 22 20:14:58.469036 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:14:58.469013 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4b8f6"]
Apr 22 20:15:00.171175 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:15:00.171076 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241f9e65-8a4a-454a-a44e-c191c97aa38e" path="/var/lib/kubelet/pods/241f9e65-8a4a-454a-a44e-c191c97aa38e/volumes"
Apr 22 20:16:27.716455 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:27.716417 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" event={"ID":"c143d5c2-9b31-49d3-8aaa-df257d9937d4","Type":"ContainerStarted","Data":"830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303"}
Apr 22 20:16:27.716875 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:27.716643 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx"
Apr 22 20:16:27.717795 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:27.717772 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" podUID="c143d5c2-9b31-49d3-8aaa-df257d9937d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused"
Apr 22 20:16:27.737396 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:27.737315 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" podStartSLOduration=6.11779749 podStartE2EDuration="2m0.737300189s" podCreationTimestamp="2026-04-22 20:14:27 +0000 UTC" firstStartedPulling="2026-04-22 20:14:32.325688498 +0000 UTC m=+3086.735102989" lastFinishedPulling="2026-04-22 20:16:26.945191181 +0000 UTC m=+3201.354605688" observedRunningTime="2026-04-22 20:16:27.737152073 +0000 UTC m=+3202.146566587" watchObservedRunningTime="2026-04-22 20:16:27.737300189 +0000 UTC m=+3202.146714703"
pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" podUID="c143d5c2-9b31-49d3-8aaa-df257d9937d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 22 20:16:38.721062 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:38.721035 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" Apr 22 20:16:48.986929 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:48.986885 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx"] Apr 22 20:16:48.987438 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:48.987212 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" podUID="c143d5c2-9b31-49d3-8aaa-df257d9937d4" containerName="kserve-container" containerID="cri-o://830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303" gracePeriod=30 Apr 22 20:16:49.072055 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.071987 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr"] Apr 22 20:16:49.072274 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.072262 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cab6bd68-5878-47db-b45a-1e84ee2fe52e" containerName="storage-initializer" Apr 22 20:16:49.072330 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.072276 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab6bd68-5878-47db-b45a-1e84ee2fe52e" containerName="storage-initializer" Apr 22 20:16:49.072330 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.072293 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cab6bd68-5878-47db-b45a-1e84ee2fe52e" containerName="kserve-container" Apr 22 20:16:49.072330 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.072299 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab6bd68-5878-47db-b45a-1e84ee2fe52e" containerName="kserve-container" Apr 22 20:16:49.072330 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.072305 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="241f9e65-8a4a-454a-a44e-c191c97aa38e" containerName="kserve-container" Apr 22 20:16:49.072330 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.072310 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="241f9e65-8a4a-454a-a44e-c191c97aa38e" containerName="kserve-container" Apr 22 20:16:49.072330 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.072319 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="241f9e65-8a4a-454a-a44e-c191c97aa38e" containerName="storage-initializer" Apr 22 20:16:49.072330 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.072324 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="241f9e65-8a4a-454a-a44e-c191c97aa38e" containerName="storage-initializer" Apr 22 20:16:49.072567 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.072384 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cab6bd68-5878-47db-b45a-1e84ee2fe52e" containerName="kserve-container" Apr 22 20:16:49.072567 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.072394 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="241f9e65-8a4a-454a-a44e-c191c97aa38e" containerName="kserve-container" Apr 22 20:16:49.076034 ip-10-0-134-231 
Apr 22 20:16:49.076034 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.076015 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr"
Apr 22 20:16:49.084605 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.084582 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr"]
Apr 22 20:16:49.137232 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.137207 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41c0087b-676a-4f70-9da4-58e600b33a18-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-xc6kr\" (UID: \"41c0087b-676a-4f70-9da4-58e600b33a18\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr"
Apr 22 20:16:49.237968 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.237937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41c0087b-676a-4f70-9da4-58e600b33a18-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-xc6kr\" (UID: \"41c0087b-676a-4f70-9da4-58e600b33a18\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr"
Apr 22 20:16:49.238317 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.238298 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41c0087b-676a-4f70-9da4-58e600b33a18-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-xc6kr\" (UID: \"41c0087b-676a-4f70-9da4-58e600b33a18\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" Apr 22 20:16:49.582883 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.582861 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr"] Apr 22 20:16:49.585495 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:16:49.585469 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c0087b_676a_4f70_9da4_58e600b33a18.slice/crio-c5dcf7803865d292fcc6ebaa4c77127a53db45950ce2002574e389b414518882 WatchSource:0}: Error finding container c5dcf7803865d292fcc6ebaa4c77127a53db45950ce2002574e389b414518882: Status 404 returned error can't find the container with id c5dcf7803865d292fcc6ebaa4c77127a53db45950ce2002574e389b414518882 Apr 22 20:16:49.587152 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.587131 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:16:49.777885 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.777852 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" event={"ID":"41c0087b-676a-4f70-9da4-58e600b33a18","Type":"ContainerStarted","Data":"ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd"} Apr 22 20:16:49.778000 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:49.777892 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" event={"ID":"41c0087b-676a-4f70-9da4-58e600b33a18","Type":"ContainerStarted","Data":"c5dcf7803865d292fcc6ebaa4c77127a53db45950ce2002574e389b414518882"} Apr 22 20:16:51.617004 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.616982 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" Apr 22 20:16:51.655334 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.655308 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c143d5c2-9b31-49d3-8aaa-df257d9937d4-kserve-provision-location\") pod \"c143d5c2-9b31-49d3-8aaa-df257d9937d4\" (UID: \"c143d5c2-9b31-49d3-8aaa-df257d9937d4\") " Apr 22 20:16:51.655680 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.655643 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c143d5c2-9b31-49d3-8aaa-df257d9937d4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c143d5c2-9b31-49d3-8aaa-df257d9937d4" (UID: "c143d5c2-9b31-49d3-8aaa-df257d9937d4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:16:51.755733 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.755679 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c143d5c2-9b31-49d3-8aaa-df257d9937d4-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:16:51.784009 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.783978 2572 generic.go:358] "Generic (PLEG): container finished" podID="c143d5c2-9b31-49d3-8aaa-df257d9937d4" containerID="830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303" exitCode=0 Apr 22 20:16:51.784112 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.784030 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" event={"ID":"c143d5c2-9b31-49d3-8aaa-df257d9937d4","Type":"ContainerDied","Data":"830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303"} Apr 22 20:16:51.784112 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.784050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" event={"ID":"c143d5c2-9b31-49d3-8aaa-df257d9937d4","Type":"ContainerDied","Data":"5dbe9c6b1d5c97ba07ddf8b8f891d9509a708291ac7e8410699eba5ecb97b920"} Apr 22 20:16:51.784112 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.784065 2572 scope.go:117] "RemoveContainer" containerID="830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303" Apr 22 20:16:51.784112 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.784082 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx" Apr 22 20:16:51.792057 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.791648 2572 scope.go:117] "RemoveContainer" containerID="b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d" Apr 22 20:16:51.799123 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.799105 2572 scope.go:117] "RemoveContainer" containerID="830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303" Apr 22 20:16:51.799384 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:16:51.799365 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303\": container with ID starting with 830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303 not found: ID does not exist" containerID="830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303" Apr 22 20:16:51.799424 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.799392 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303"} err="failed to get container status \"830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303\": rpc error: code = NotFound desc = could not find container \"830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303\": container with ID starting with 830b86271e5ee95a0264fbc4ade69bdee632c5f432bb006cf1ee9f8f44751303 not found: ID does not exist" Apr 22 20:16:51.799424 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.799408 2572 scope.go:117] "RemoveContainer" containerID="b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d" Apr 22 20:16:51.799630 ip-10-0-134-231 kubenswrapper[2572]: E0422 
20:16:51.799612 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d\": container with ID starting with b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d not found: ID does not exist" containerID="b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d" Apr 22 20:16:51.799682 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.799636 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d"} err="failed to get container status \"b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d\": rpc error: code = NotFound desc = could not find container \"b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d\": container with ID starting with b2918ceefda6d59ae2abc9d9ab3251b0ea5c87b1d526be3f513b97d68b26813d not found: ID does not exist" Apr 22 20:16:51.804536 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.804515 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx"] Apr 22 20:16:51.808177 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:51.808159 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-t7lsx"] Apr 22 20:16:52.169981 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:52.169955 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c143d5c2-9b31-49d3-8aaa-df257d9937d4" path="/var/lib/kubelet/pods/c143d5c2-9b31-49d3-8aaa-df257d9937d4/volumes" Apr 22 20:16:53.791471 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:53.791434 2572 generic.go:358] "Generic (PLEG): container finished" podID="41c0087b-676a-4f70-9da4-58e600b33a18" containerID="ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd" exitCode=0 Apr 22 20:16:53.791860 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:16:53.791503 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" event={"ID":"41c0087b-676a-4f70-9da4-58e600b33a18","Type":"ContainerDied","Data":"ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd"} Apr 22 20:17:11.843446 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:17:11.843404 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" event={"ID":"41c0087b-676a-4f70-9da4-58e600b33a18","Type":"ContainerStarted","Data":"3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb"} Apr 22 20:17:11.844020 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:17:11.843696 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" Apr 22 20:17:11.844990 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:17:11.844965 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 22 20:17:11.859320 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:17:11.859277 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" podStartSLOduration=5.6447743500000005 
podStartE2EDuration="22.859263763s" podCreationTimestamp="2026-04-22 20:16:49 +0000 UTC" firstStartedPulling="2026-04-22 20:16:53.792776586 +0000 UTC m=+3228.202191077" lastFinishedPulling="2026-04-22 20:17:11.007265999 +0000 UTC m=+3245.416680490" observedRunningTime="2026-04-22 20:17:11.857770333 +0000 UTC m=+3246.267184857" watchObservedRunningTime="2026-04-22 20:17:11.859263763 +0000 UTC m=+3246.268678276" Apr 22 20:17:12.846059 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:17:12.846022 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 22 20:17:22.846462 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:17:22.846415 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 22 20:17:32.846163 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:17:32.846119 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 22 20:17:42.846485 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:17:42.846440 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 22 20:17:52.846843 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:17:52.846800 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 22 20:18:02.846555 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:02.846510 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 22 20:18:06.326336 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:06.326310 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 20:18:06.345171 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:06.345149 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 20:18:12.847859 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:12.847824 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" Apr 22 20:18:19.192508 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.192475 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr"] Apr 22 20:18:19.192887 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.192820 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="kserve-container" containerID="cri-o://3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb" gracePeriod=30 Apr 22 20:18:19.263995 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.263968 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n"] Apr 22 20:18:19.264262 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.264251 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c143d5c2-9b31-49d3-8aaa-df257d9937d4" containerName="kserve-container" Apr 22 20:18:19.264306 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.264264 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c143d5c2-9b31-49d3-8aaa-df257d9937d4" containerName="kserve-container" Apr 22 20:18:19.264306 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.264277 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c143d5c2-9b31-49d3-8aaa-df257d9937d4" containerName="storage-initializer" Apr 22 20:18:19.264306 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.264282 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c143d5c2-9b31-49d3-8aaa-df257d9937d4" containerName="storage-initializer" Apr 22 20:18:19.264401 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.264326 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c143d5c2-9b31-49d3-8aaa-df257d9937d4" containerName="kserve-container" Apr 22 20:18:19.267161 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.267143 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" Apr 22 20:18:19.274896 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.274877 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n"] Apr 22 20:18:19.364994 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.364973 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1640a50a-e7de-47bd-b620-b92d3fd7ee56-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n\" (UID: \"1640a50a-e7de-47bd-b620-b92d3fd7ee56\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" Apr 22 20:18:19.465840 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.465782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1640a50a-e7de-47bd-b620-b92d3fd7ee56-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n\" (UID: \"1640a50a-e7de-47bd-b620-b92d3fd7ee56\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" Apr 22 20:18:19.466076 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.466060 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1640a50a-e7de-47bd-b620-b92d3fd7ee56-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n\" (UID: \"1640a50a-e7de-47bd-b620-b92d3fd7ee56\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" Apr 22 20:18:19.578219 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.578196 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" Apr 22 20:18:19.723085 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:19.723058 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n"] Apr 22 20:18:19.725473 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:18:19.725447 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1640a50a_e7de_47bd_b620_b92d3fd7ee56.slice/crio-6fdae3a0b80e1d1770879d95351bd14cdbd905b2cf6cb3369d320cb99fa6fac6 WatchSource:0}: Error finding container 6fdae3a0b80e1d1770879d95351bd14cdbd905b2cf6cb3369d320cb99fa6fac6: Status 404 returned error can't find the container with id 6fdae3a0b80e1d1770879d95351bd14cdbd905b2cf6cb3369d320cb99fa6fac6 Apr 22 20:18:20.029060 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:20.028982 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" event={"ID":"1640a50a-e7de-47bd-b620-b92d3fd7ee56","Type":"ContainerStarted","Data":"85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5"} Apr 22 20:18:20.029060 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:20.029017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" event={"ID":"1640a50a-e7de-47bd-b620-b92d3fd7ee56","Type":"ContainerStarted","Data":"6fdae3a0b80e1d1770879d95351bd14cdbd905b2cf6cb3369d320cb99fa6fac6"} Apr 22 20:18:22.226517 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:22.226495 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" Apr 22 20:18:22.284523 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:22.284498 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41c0087b-676a-4f70-9da4-58e600b33a18-kserve-provision-location\") pod \"41c0087b-676a-4f70-9da4-58e600b33a18\" (UID: \"41c0087b-676a-4f70-9da4-58e600b33a18\") " Apr 22 20:18:22.284809 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:22.284785 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c0087b-676a-4f70-9da4-58e600b33a18-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "41c0087b-676a-4f70-9da4-58e600b33a18" (UID: "41c0087b-676a-4f70-9da4-58e600b33a18"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:18:22.385913 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:22.385856 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41c0087b-676a-4f70-9da4-58e600b33a18-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:18:23.040142 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.040103 2572 generic.go:358] "Generic (PLEG): container finished" podID="41c0087b-676a-4f70-9da4-58e600b33a18" containerID="3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb" exitCode=0 Apr 22 20:18:23.040317 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.040178 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" Apr 22 20:18:23.040317 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.040191 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" event={"ID":"41c0087b-676a-4f70-9da4-58e600b33a18","Type":"ContainerDied","Data":"3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb"} Apr 22 20:18:23.040317 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.040229 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr" event={"ID":"41c0087b-676a-4f70-9da4-58e600b33a18","Type":"ContainerDied","Data":"c5dcf7803865d292fcc6ebaa4c77127a53db45950ce2002574e389b414518882"} Apr 22 20:18:23.040317 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.040244 2572 scope.go:117] "RemoveContainer" containerID="3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb" Apr 22 20:18:23.048407 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.048389 2572 scope.go:117] "RemoveContainer" containerID="ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd" Apr 22 20:18:23.055004 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.054990 2572 scope.go:117] "RemoveContainer" containerID="3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb" Apr 22 20:18:23.055226 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:18:23.055207 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb\": container with ID starting with 3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb not found: ID does not exist" containerID="3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb" Apr 22 20:18:23.055287 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.055232 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb"} err="failed to get container status \"3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb\": rpc error: code = NotFound desc = could not find container \"3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb\": container with ID starting with 3811ac5f3ef4b7e16d78a65086dde04ff957dbca6cf84fa46f8a391501234bcb not found: ID does not exist" Apr 22 20:18:23.055287 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.055249 2572 scope.go:117] "RemoveContainer" containerID="ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd" Apr 22 20:18:23.055444 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:18:23.055430 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd\": container with ID starting with ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd not found: ID does not exist" containerID="ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd" Apr 22 20:18:23.055484 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.055448 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd"} err="failed to get container status \"ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd\": rpc error: code = NotFound desc = 
could not find container \"ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd\": container with ID starting with ac1b4489b5c0f34ec4323abad7bcf7d95c0fd7140def834342b40c3aa7b5b0bd not found: ID does not exist" Apr 22 20:18:23.061302 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.061282 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr"] Apr 22 20:18:23.064593 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:23.064572 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-xc6kr"] Apr 22 20:18:24.043885 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:24.043861 2572 generic.go:358] "Generic (PLEG): container finished" podID="1640a50a-e7de-47bd-b620-b92d3fd7ee56" containerID="85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5" exitCode=0 Apr 22 20:18:24.044250 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:24.043939 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" event={"ID":"1640a50a-e7de-47bd-b620-b92d3fd7ee56","Type":"ContainerDied","Data":"85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5"} Apr 22 20:18:24.170371 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:24.170347 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" path="/var/lib/kubelet/pods/41c0087b-676a-4f70-9da4-58e600b33a18/volumes" Apr 22 20:18:25.049380 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:25.049347 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" event={"ID":"1640a50a-e7de-47bd-b620-b92d3fd7ee56","Type":"ContainerStarted","Data":"138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69"} Apr 22 20:18:25.049767 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:25.049546 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" Apr 22 20:18:25.064484 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:25.064440 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" podStartSLOduration=6.064428376 podStartE2EDuration="6.064428376s" podCreationTimestamp="2026-04-22 20:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:18:25.063688471 +0000 UTC m=+3319.473102977" watchObservedRunningTime="2026-04-22 20:18:25.064428376 +0000 UTC m=+3319.473842889" Apr 22 20:18:56.133522 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:56.133492 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" Apr 22 20:18:59.356563 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.356524 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n"] Apr 22 20:18:59.357120 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.356954 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" podUID="1640a50a-e7de-47bd-b620-b92d3fd7ee56" containerName="kserve-container" 
containerID="cri-o://138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69" gracePeriod=30 Apr 22 20:18:59.412672 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.412644 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2"] Apr 22 20:18:59.412956 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.412944 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="kserve-container" Apr 22 20:18:59.413006 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.412958 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="kserve-container" Apr 22 20:18:59.413006 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.412970 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="storage-initializer" Apr 22 20:18:59.413006 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.412976 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="storage-initializer" Apr 22 20:18:59.413109 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.413021 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="41c0087b-676a-4f70-9da4-58e600b33a18" containerName="kserve-container" Apr 22 20:18:59.416072 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.416055 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" Apr 22 20:18:59.423173 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.423154 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2"] Apr 22 20:18:59.536987 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.536964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e404e1cc-3d26-49eb-8691-52b6e0ec142a-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-nktm2\" (UID: \"e404e1cc-3d26-49eb-8691-52b6e0ec142a\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" Apr 22 20:18:59.638159 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.638098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e404e1cc-3d26-49eb-8691-52b6e0ec142a-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-nktm2\" (UID: \"e404e1cc-3d26-49eb-8691-52b6e0ec142a\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" Apr 22 20:18:59.638405 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.638390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e404e1cc-3d26-49eb-8691-52b6e0ec142a-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-nktm2\" (UID: \"e404e1cc-3d26-49eb-8691-52b6e0ec142a\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" Apr 22 20:18:59.725735 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.725710 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" Apr 22 20:18:59.842325 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:18:59.842300 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2"] Apr 22 20:18:59.844355 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:18:59.844321 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode404e1cc_3d26_49eb_8691_52b6e0ec142a.slice/crio-53ff65b2da156dd2ed3e4262715b6202abcec0c2ef42e93932625de5030fbf77 WatchSource:0}: Error finding container 53ff65b2da156dd2ed3e4262715b6202abcec0c2ef42e93932625de5030fbf77: Status 404 returned error can't find the container with id 53ff65b2da156dd2ed3e4262715b6202abcec0c2ef42e93932625de5030fbf77 Apr 22 20:19:00.154341 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:00.154308 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" event={"ID":"e404e1cc-3d26-49eb-8691-52b6e0ec142a","Type":"ContainerStarted","Data":"aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296"} Apr 22 20:19:00.154341 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:00.154341 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" event={"ID":"e404e1cc-3d26-49eb-8691-52b6e0ec142a","Type":"ContainerStarted","Data":"53ff65b2da156dd2ed3e4262715b6202abcec0c2ef42e93932625de5030fbf77"} Apr 22 20:19:04.166251 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:04.166221 2572 generic.go:358] "Generic (PLEG): container finished" podID="e404e1cc-3d26-49eb-8691-52b6e0ec142a" containerID="aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296" exitCode=0 Apr 22 20:19:04.169040 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:04.169012 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" event={"ID":"e404e1cc-3d26-49eb-8691-52b6e0ec142a","Type":"ContainerDied","Data":"aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296"} Apr 22 20:19:04.589857 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:04.589836 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" Apr 22 20:19:04.678806 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:04.678780 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1640a50a-e7de-47bd-b620-b92d3fd7ee56-kserve-provision-location\") pod \"1640a50a-e7de-47bd-b620-b92d3fd7ee56\" (UID: \"1640a50a-e7de-47bd-b620-b92d3fd7ee56\") " Apr 22 20:19:04.679047 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:04.679026 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1640a50a-e7de-47bd-b620-b92d3fd7ee56-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1640a50a-e7de-47bd-b620-b92d3fd7ee56" (UID: "1640a50a-e7de-47bd-b620-b92d3fd7ee56"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:19:04.780272 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:04.780219 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1640a50a-e7de-47bd-b620-b92d3fd7ee56-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:19:05.170425 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.170394 2572 generic.go:358] "Generic (PLEG): container finished" podID="1640a50a-e7de-47bd-b620-b92d3fd7ee56" containerID="138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69" exitCode=0 Apr 22 20:19:05.170867 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.170470 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" Apr 22 20:19:05.170867 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.170477 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" event={"ID":"1640a50a-e7de-47bd-b620-b92d3fd7ee56","Type":"ContainerDied","Data":"138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69"} Apr 22 20:19:05.170867 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.170507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n" event={"ID":"1640a50a-e7de-47bd-b620-b92d3fd7ee56","Type":"ContainerDied","Data":"6fdae3a0b80e1d1770879d95351bd14cdbd905b2cf6cb3369d320cb99fa6fac6"} Apr 22 20:19:05.170867 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.170522 2572 scope.go:117] "RemoveContainer" containerID="138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69" Apr 22 20:19:05.172165 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.172147 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" event={"ID":"e404e1cc-3d26-49eb-8691-52b6e0ec142a","Type":"ContainerStarted","Data":"0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86"} Apr 22 20:19:05.172338 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.172323 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" Apr 22 20:19:05.178971 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.178956 2572 scope.go:117] "RemoveContainer" containerID="85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5" Apr 22 20:19:05.185579 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.185564 2572 scope.go:117] "RemoveContainer" containerID="138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69" Apr 22 20:19:05.185806 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:19:05.185784 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69\": container with ID starting with 138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69 not found: ID does not exist" containerID="138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69" Apr 22 20:19:05.185866 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.185814 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69"} 
err="failed to get container status \"138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69\": rpc error: code = NotFound desc = could not find container \"138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69\": container with ID starting with 138b8b8fdaa2c58b167bd0b6546fa03f06cd98fdd49e3d08e96d0ced4f86de69 not found: ID does not exist" Apr 22 20:19:05.185866 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.185831 2572 scope.go:117] "RemoveContainer" containerID="85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5" Apr 22 20:19:05.186029 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:19:05.186014 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5\": container with ID starting with 85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5 not found: ID does not exist" containerID="85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5" Apr 22 20:19:05.186073 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.186034 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5"} err="failed to get container status \"85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5\": rpc error: code = NotFound desc = could not find container \"85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5\": container with ID starting with 85dfd421f0da9100c9341465f4d990571ffbe82d1a2c23599e94754ca71f72a5 not found: ID does not exist" Apr 22 20:19:05.189741 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.189703 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" podStartSLOduration=6.189692249 podStartE2EDuration="6.189692249s" podCreationTimestamp="2026-04-22 20:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:19:05.187878336 +0000 UTC m=+3359.597292848" watchObservedRunningTime="2026-04-22 20:19:05.189692249 +0000 UTC m=+3359.599106822" Apr 22 20:19:05.199102 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.199083 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n"] Apr 22 20:19:05.202996 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:05.202977 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-nhs5n"] Apr 22 20:19:06.169728 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:06.169694 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1640a50a-e7de-47bd-b620-b92d3fd7ee56" path="/var/lib/kubelet/pods/1640a50a-e7de-47bd-b620-b92d3fd7ee56/volumes" Apr 22 20:19:36.232804 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:36.232767 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" Apr 22 20:19:39.553695 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.553649 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2"] Apr 22 20:19:39.554060 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.553926 2572 kuberuntime_container.go:864] "Killing container with a grace 
period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" podUID="e404e1cc-3d26-49eb-8691-52b6e0ec142a" containerName="kserve-container" containerID="cri-o://0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86" gracePeriod=30 Apr 22 20:19:39.585690 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.585645 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m"] Apr 22 20:19:39.586095 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.586075 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1640a50a-e7de-47bd-b620-b92d3fd7ee56" containerName="storage-initializer" Apr 22 20:19:39.586095 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.586095 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1640a50a-e7de-47bd-b620-b92d3fd7ee56" containerName="storage-initializer" Apr 22 20:19:39.586212 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.586116 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1640a50a-e7de-47bd-b620-b92d3fd7ee56" containerName="kserve-container" Apr 22 20:19:39.586212 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.586121 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1640a50a-e7de-47bd-b620-b92d3fd7ee56" containerName="kserve-container" Apr 22 20:19:39.586212 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.586168 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1640a50a-e7de-47bd-b620-b92d3fd7ee56" containerName="kserve-container" Apr 22 20:19:39.589325 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.589310 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" Apr 22 20:19:39.597391 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.597370 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m"] Apr 22 20:19:39.712873 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.712849 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df419fcc-7508-49a2-817e-3eb86ca46262-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-dtz8m\" (UID: \"df419fcc-7508-49a2-817e-3eb86ca46262\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" Apr 22 20:19:39.813489 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.813427 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df419fcc-7508-49a2-817e-3eb86ca46262-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-dtz8m\" (UID: \"df419fcc-7508-49a2-817e-3eb86ca46262\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" Apr 22 20:19:39.813771 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:39.813755 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df419fcc-7508-49a2-817e-3eb86ca46262-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-dtz8m\" (UID: \"df419fcc-7508-49a2-817e-3eb86ca46262\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" Apr 22 20:19:39.899531 ip-10-0-134-231 kubenswrapper[2572]: I0422 
20:19:39.899512 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" Apr 22 20:19:40.051850 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:40.051815 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m"] Apr 22 20:19:40.054790 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:19:40.054764 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf419fcc_7508_49a2_817e_3eb86ca46262.slice/crio-f1f04a34447ced9180b31772cb8ffe07ebb392ba18810926c55153136e96cea2 WatchSource:0}: Error finding container f1f04a34447ced9180b31772cb8ffe07ebb392ba18810926c55153136e96cea2: Status 404 returned error can't find the container with id f1f04a34447ced9180b31772cb8ffe07ebb392ba18810926c55153136e96cea2 Apr 22 20:19:40.270951 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:40.270921 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" event={"ID":"df419fcc-7508-49a2-817e-3eb86ca46262","Type":"ContainerStarted","Data":"3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874"} Apr 22 20:19:40.270951 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:40.270954 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" event={"ID":"df419fcc-7508-49a2-817e-3eb86ca46262","Type":"ContainerStarted","Data":"f1f04a34447ced9180b31772cb8ffe07ebb392ba18810926c55153136e96cea2"} Apr 22 20:19:44.283465 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:44.283436 2572 generic.go:358] "Generic (PLEG): container finished" podID="df419fcc-7508-49a2-817e-3eb86ca46262" containerID="3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874" exitCode=0 Apr 22 20:19:44.283856 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:44.283511 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" event={"ID":"df419fcc-7508-49a2-817e-3eb86ca46262","Type":"ContainerDied","Data":"3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874"} Apr 22 20:19:44.788384 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:44.788361 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" Apr 22 20:19:44.956206 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:44.956181 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e404e1cc-3d26-49eb-8691-52b6e0ec142a-kserve-provision-location\") pod \"e404e1cc-3d26-49eb-8691-52b6e0ec142a\" (UID: \"e404e1cc-3d26-49eb-8691-52b6e0ec142a\") " Apr 22 20:19:44.956464 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:44.956444 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e404e1cc-3d26-49eb-8691-52b6e0ec142a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e404e1cc-3d26-49eb-8691-52b6e0ec142a" (UID: "e404e1cc-3d26-49eb-8691-52b6e0ec142a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:19:45.057001 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.056979 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e404e1cc-3d26-49eb-8691-52b6e0ec142a-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:19:45.287374 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.287306 2572 generic.go:358] "Generic (PLEG): container finished" podID="e404e1cc-3d26-49eb-8691-52b6e0ec142a" containerID="0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86" exitCode=0 Apr 22 20:19:45.287807 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.287382 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" Apr 22 20:19:45.287807 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.287386 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" event={"ID":"e404e1cc-3d26-49eb-8691-52b6e0ec142a","Type":"ContainerDied","Data":"0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86"} Apr 22 20:19:45.287807 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.287418 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2" event={"ID":"e404e1cc-3d26-49eb-8691-52b6e0ec142a","Type":"ContainerDied","Data":"53ff65b2da156dd2ed3e4262715b6202abcec0c2ef42e93932625de5030fbf77"} Apr 22 20:19:45.287807 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.287437 2572 scope.go:117] "RemoveContainer" containerID="0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86" Apr 22 20:19:45.289213 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.289190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" event={"ID":"df419fcc-7508-49a2-817e-3eb86ca46262","Type":"ContainerStarted","Data":"ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827"} Apr 22 20:19:45.289487 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.289465 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" Apr 22 20:19:45.290904 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.290874 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 22 20:19:45.295467 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.295455 2572 scope.go:117] "RemoveContainer" containerID="aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296" Apr 22 20:19:45.302433 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.302416 2572 scope.go:117] "RemoveContainer" containerID="0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86" Apr 22 20:19:45.302686 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:19:45.302651 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86\": container with ID starting with 0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86 not 
found: ID does not exist" containerID="0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86" Apr 22 20:19:45.302763 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.302690 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86"} err="failed to get container status \"0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86\": rpc error: code = NotFound desc = could not find container \"0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86\": container with ID starting with 0a2d3dc798af44a4422af85f5740c14f57fb58452686ecb9ab223814f1dd6b86 not found: ID does not exist" Apr 22 20:19:45.302763 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.302708 2572 scope.go:117] "RemoveContainer" containerID="aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296" Apr 22 20:19:45.302945 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:19:45.302927 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296\": container with ID starting with aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296 not found: ID does not exist" containerID="aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296" Apr 22 20:19:45.302993 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.302955 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296"} err="failed to get container status \"aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296\": rpc error: code = NotFound desc = could not find container \"aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296\": container with ID starting with aa35d28608fc17955c7fe7268e607dc780bb667ca6870f87c07dc45fcca6b296 not found: ID does not exist" Apr 22 20:19:45.306964 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.306927 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" podStartSLOduration=6.306916828 podStartE2EDuration="6.306916828s" podCreationTimestamp="2026-04-22 20:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:19:45.304866172 +0000 UTC m=+3399.714280687" watchObservedRunningTime="2026-04-22 20:19:45.306916828 +0000 UTC m=+3399.716331341" Apr 22 20:19:45.316977 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.316951 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2"] Apr 22 20:19:45.318556 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:45.318535 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-nktm2"] Apr 22 20:19:46.169567 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:46.169536 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e404e1cc-3d26-49eb-8691-52b6e0ec142a" path="/var/lib/kubelet/pods/e404e1cc-3d26-49eb-8691-52b6e0ec142a/volumes" Apr 22 20:19:46.293460 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:46.293428 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" 
podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 22 20:19:56.294224 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:19:56.294138 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 22 20:20:06.293684 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:06.293621 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 22 20:20:16.293922 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:16.293877 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 22 20:20:26.293480 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:26.293441 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 22 20:20:36.294309 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:36.294270 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 22 20:20:46.295324 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:46.295290 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" Apr 22 20:20:49.691610 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.691575 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m"] Apr 22 20:20:49.692002 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.691887 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="kserve-container" containerID="cri-o://ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827" gracePeriod=30 Apr 22 20:20:49.740895 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.740863 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"] Apr 22 20:20:49.741225 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.741208 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e404e1cc-3d26-49eb-8691-52b6e0ec142a" containerName="kserve-container" Apr 22 20:20:49.741317 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.741227 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e404e1cc-3d26-49eb-8691-52b6e0ec142a" 
Apr 22 20:20:49.741317 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.741227 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e404e1cc-3d26-49eb-8691-52b6e0ec142a" containerName="kserve-container"
Apr 22 20:20:49.741317 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.741250 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e404e1cc-3d26-49eb-8691-52b6e0ec142a" containerName="storage-initializer"
Apr 22 20:20:49.741317 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.741259 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e404e1cc-3d26-49eb-8691-52b6e0ec142a" containerName="storage-initializer"
Apr 22 20:20:49.741489 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.741347 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e404e1cc-3d26-49eb-8691-52b6e0ec142a" containerName="kserve-container"
Apr 22 20:20:49.744247 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.744228 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"
Apr 22 20:20:49.751438 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.751418 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"]
Apr 22 20:20:49.806582 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.806558 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0603c7f6-bbcb-4fe6-a43c-4e3835491adf-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr\" (UID: \"0603c7f6-bbcb-4fe6-a43c-4e3835491adf\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"
Apr 22 20:20:49.906974 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.906950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0603c7f6-bbcb-4fe6-a43c-4e3835491adf-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr\" (UID: \"0603c7f6-bbcb-4fe6-a43c-4e3835491adf\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"
Apr 22 20:20:49.907277 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:49.907258 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0603c7f6-bbcb-4fe6-a43c-4e3835491adf-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr\" (UID: \"0603c7f6-bbcb-4fe6-a43c-4e3835491adf\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"
Apr 22 20:20:50.054611 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:50.054545 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"
Apr 22 20:20:50.169971 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:50.169950 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"]
Apr 22 20:20:50.170768 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:20:50.170745 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0603c7f6_bbcb_4fe6_a43c_4e3835491adf.slice/crio-6a824115119fb231f359a25e409e65bfb8978bd3b8f7da623386b9893a235199 WatchSource:0}: Error finding container 6a824115119fb231f359a25e409e65bfb8978bd3b8f7da623386b9893a235199: Status 404 returned error can't find the container with id 6a824115119fb231f359a25e409e65bfb8978bd3b8f7da623386b9893a235199
Apr 22 20:20:50.470498 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:50.470459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr" event={"ID":"0603c7f6-bbcb-4fe6-a43c-4e3835491adf","Type":"ContainerStarted","Data":"a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee"}
Apr 22 20:20:50.470498 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:50.470500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr" event={"ID":"0603c7f6-bbcb-4fe6-a43c-4e3835491adf","Type":"ContainerStarted","Data":"6a824115119fb231f359a25e409e65bfb8978bd3b8f7da623386b9893a235199"}
Apr 22 20:20:52.718144 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:52.718125 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m"
Apr 22 20:20:52.828184 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:52.828115 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df419fcc-7508-49a2-817e-3eb86ca46262-kserve-provision-location\") pod \"df419fcc-7508-49a2-817e-3eb86ca46262\" (UID: \"df419fcc-7508-49a2-817e-3eb86ca46262\") "
Apr 22 20:20:52.828419 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:52.828396 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df419fcc-7508-49a2-817e-3eb86ca46262-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "df419fcc-7508-49a2-817e-3eb86ca46262" (UID: "df419fcc-7508-49a2-817e-3eb86ca46262"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:20:52.929462 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:52.929422 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df419fcc-7508-49a2-817e-3eb86ca46262-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 20:20:53.481260 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.481215 2572 generic.go:358] "Generic (PLEG): container finished" podID="df419fcc-7508-49a2-817e-3eb86ca46262" containerID="ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827" exitCode=0
Apr 22 20:20:53.481472 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.481279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" event={"ID":"df419fcc-7508-49a2-817e-3eb86ca46262","Type":"ContainerDied","Data":"ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827"}
Apr 22 20:20:53.481472 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.481292 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m"
Apr 22 20:20:53.481472 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.481315 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m" event={"ID":"df419fcc-7508-49a2-817e-3eb86ca46262","Type":"ContainerDied","Data":"f1f04a34447ced9180b31772cb8ffe07ebb392ba18810926c55153136e96cea2"}
Apr 22 20:20:53.481472 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.481337 2572 scope.go:117] "RemoveContainer" containerID="ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827"
Apr 22 20:20:53.490256 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.490237 2572 scope.go:117] "RemoveContainer" containerID="3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874"
Apr 22 20:20:53.497864 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.497848 2572 scope.go:117] "RemoveContainer" containerID="ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827"
Apr 22 20:20:53.498171 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:20:53.498144 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827\": container with ID starting with ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827 not found: ID does not exist" containerID="ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827"
Apr 22 20:20:53.498276 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.498173 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827"} err="failed to get container status \"ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827\": rpc error: code = NotFound desc = could not find container \"ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827\": container with ID starting with ed33a2a5616bec1467ca04e9b7bd9de9a1a9b9142fb2e5793928fb8d12dc8827 not found: ID does not exist"
Apr 22 20:20:53.498276 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.498205 2572 scope.go:117] "RemoveContainer" containerID="3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874"
Apr 22 20:20:53.498467 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:20:53.498451 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874\": container with ID starting with 3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874 not found: ID does not exist" containerID="3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874"
Apr 22 20:20:53.498506 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.498470 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874"} err="failed to get container status \"3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874\": rpc error: code = NotFound desc = could not find container \"3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874\": container with ID starting with 3de0c5270c16113ec66c644ab38f56de707c69844fbbdd29c58a2d8c69f07874 not found: ID does not exist"
Apr 22 20:20:53.507079 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.507045 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m"]
Apr 22 20:20:53.507201 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:53.507095 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dtz8m"]
Apr 22 20:20:54.169701 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:54.169656 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" path="/var/lib/kubelet/pods/df419fcc-7508-49a2-817e-3eb86ca46262/volumes"
Apr 22 20:20:54.485390 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:54.485316 2572 generic.go:358] "Generic (PLEG): container finished" podID="0603c7f6-bbcb-4fe6-a43c-4e3835491adf" containerID="a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee" exitCode=0
Apr 22 20:20:54.485544 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:54.485399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr" event={"ID":"0603c7f6-bbcb-4fe6-a43c-4e3835491adf","Type":"ContainerDied","Data":"a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee"}
Apr 22 20:20:55.490788 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:55.490753 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr" event={"ID":"0603c7f6-bbcb-4fe6-a43c-4e3835491adf","Type":"ContainerStarted","Data":"cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d"}
Apr 22 20:20:55.491185 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:55.490979 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"
Apr 22 20:20:55.507049 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:20:55.507008 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr" podStartSLOduration=6.506994183 podStartE2EDuration="6.506994183s" podCreationTimestamp="2026-04-22 20:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:20:55.506594677 +0000 UTC m=+3469.916009190" watchObservedRunningTime="2026-04-22 20:20:55.506994183 +0000 UTC m=+3469.916408696"
Apr 22 20:21:26.531770 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:26.531652 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr" podUID="0603c7f6-bbcb-4fe6-a43c-4e3835491adf" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 22 20:21:36.496933 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:36.496904 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"
Apr 22 20:21:39.849033 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:39.849000 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"]
Apr 22 20:21:39.849443 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:39.849286 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr" podUID="0603c7f6-bbcb-4fe6-a43c-4e3835491adf" containerName="kserve-container" containerID="cri-o://cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d" gracePeriod=30
Apr 22 20:21:39.913190 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:39.913154 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"]
Apr 22 20:21:39.913532 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:39.913514 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="storage-initializer"
Apr 22 20:21:39.913624 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:39.913533 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="storage-initializer"
Apr 22 20:21:39.913624 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:39.913572 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="kserve-container"
Apr 22 20:21:39.913624 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:39.913581 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="kserve-container"
Apr 22 20:21:39.913824 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:39.913655 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="df419fcc-7508-49a2-817e-3eb86ca46262" containerName="kserve-container"
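Each teardown/bring-up cycle in this section closes with an "Observed pod startup duration" record from pod_startup_latency_tracker.go, and the podStartSLOduration values (about 6s for the predictors above) are the most direct startup-latency signal in the log. A small sketch for pulling them out, assuming journal text on stdin; the field names are exactly as logged, everything else is illustrative:

#!/usr/bin/env python3
"""Collect per-pod startup latencies from pod_startup_latency_tracker records."""
import re
import sys

PATTERN = re.compile(
    r'"Observed pod startup duration" pod="(?P<pod>[^"]+)" '
    r'podStartSLOduration=(?P<slo>[0-9.]+)'
)

for line in sys.stdin:
    m = PATTERN.search(line)
    if m:
        print(f"{m.group('pod')}: {float(m.group('slo')):.3f}s")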
Apr 22 20:21:39.916889 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:39.916870 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"
Apr 22 20:21:39.927921 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:39.927900 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"]
Apr 22 20:21:40.061999 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:40.061968 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef2c8de5-7360-4808-a21f-6b8ffcba776b-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-fx2ls\" (UID: \"ef2c8de5-7360-4808-a21f-6b8ffcba776b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"
Apr 22 20:21:40.163146 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:40.163112 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef2c8de5-7360-4808-a21f-6b8ffcba776b-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-fx2ls\" (UID: \"ef2c8de5-7360-4808-a21f-6b8ffcba776b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"
Apr 22 20:21:40.163462 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:40.163444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef2c8de5-7360-4808-a21f-6b8ffcba776b-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-fx2ls\" (UID: \"ef2c8de5-7360-4808-a21f-6b8ffcba776b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"
Apr 22 20:21:40.226320 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:40.226295 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"
Apr 22 20:21:40.342824 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:40.342800 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"]
Apr 22 20:21:40.345371 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:21:40.345345 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef2c8de5_7360_4808_a21f_6b8ffcba776b.slice/crio-c8ff34b92791daa77ccbbbaed8457f43660317e1100ddb919773e5eb111c3327 WatchSource:0}: Error finding container c8ff34b92791daa77ccbbbaed8457f43660317e1100ddb919773e5eb111c3327: Status 404 returned error can't find the container with id c8ff34b92791daa77ccbbbaed8457f43660317e1100ddb919773e5eb111c3327
Apr 22 20:21:40.615490 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:40.615407 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" event={"ID":"ef2c8de5-7360-4808-a21f-6b8ffcba776b","Type":"ContainerStarted","Data":"cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4"}
Apr 22 20:21:40.615490 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:40.615441 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" event={"ID":"ef2c8de5-7360-4808-a21f-6b8ffcba776b","Type":"ContainerStarted","Data":"c8ff34b92791daa77ccbbbaed8457f43660317e1100ddb919773e5eb111c3327"}
Apr 22 20:21:44.628627 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:44.628544 2572 generic.go:358] "Generic (PLEG): container finished" podID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerID="cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4" exitCode=0
Apr 22 20:21:44.628975 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:44.628621 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" event={"ID":"ef2c8de5-7360-4808-a21f-6b8ffcba776b","Type":"ContainerDied","Data":"cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4"}
Apr 22 20:21:45.633827 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:45.633781 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" event={"ID":"ef2c8de5-7360-4808-a21f-6b8ffcba776b","Type":"ContainerStarted","Data":"e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a"}
Apr 22 20:21:45.634270 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:45.634151 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"
Apr 22 20:21:45.635279 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:45.635245 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 22 20:21:45.649926 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:45.649885 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" podStartSLOduration=6.649874463 podStartE2EDuration="6.649874463s" podCreationTimestamp="2026-04-22 20:21:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:21:45.648711218 +0000 UTC m=+3520.058125732" watchObservedRunningTime="2026-04-22 20:21:45.649874463 +0000 UTC m=+3520.059288975"
Apr 22 20:21:46.118868 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.118846 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"
Apr 22 20:21:46.208803 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.208740 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0603c7f6-bbcb-4fe6-a43c-4e3835491adf-kserve-provision-location\") pod \"0603c7f6-bbcb-4fe6-a43c-4e3835491adf\" (UID: \"0603c7f6-bbcb-4fe6-a43c-4e3835491adf\") "
Apr 22 20:21:46.209065 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.209040 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0603c7f6-bbcb-4fe6-a43c-4e3835491adf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0603c7f6-bbcb-4fe6-a43c-4e3835491adf" (UID: "0603c7f6-bbcb-4fe6-a43c-4e3835491adf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:21:46.309267 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.309246 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0603c7f6-bbcb-4fe6-a43c-4e3835491adf-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 20:21:46.638048 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.637961 2572 generic.go:358] "Generic (PLEG): container finished" podID="0603c7f6-bbcb-4fe6-a43c-4e3835491adf" containerID="cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d" exitCode=0
Apr 22 20:21:46.638048 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.638038 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"
Apr 22 20:21:46.638540 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.638047 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr" event={"ID":"0603c7f6-bbcb-4fe6-a43c-4e3835491adf","Type":"ContainerDied","Data":"cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d"}
Apr 22 20:21:46.638540 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.638085 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr" event={"ID":"0603c7f6-bbcb-4fe6-a43c-4e3835491adf","Type":"ContainerDied","Data":"6a824115119fb231f359a25e409e65bfb8978bd3b8f7da623386b9893a235199"}
Apr 22 20:21:46.638540 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.638105 2572 scope.go:117] "RemoveContainer" containerID="cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d"
Apr 22 20:21:46.638736 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.638544 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 22 20:21:46.645842 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.645820 2572 scope.go:117] "RemoveContainer" containerID="a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee"
Apr 22 20:21:46.652628 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.652614 2572 scope.go:117] "RemoveContainer" containerID="cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d"
Apr 22 20:21:46.652867 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:21:46.652847 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d\": container with ID starting with cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d not found: ID does not exist" containerID="cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d"
Apr 22 20:21:46.652905 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.652876 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d"} err="failed to get container status \"cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d\": rpc error: code = NotFound desc = could not find container \"cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d\": container with ID starting with cedc8b34530449fdc948e7dc1767a85e3773f3e13c9a5664759f5377bfd53c3d not found: ID does not exist"
Apr 22 20:21:46.652905 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.652893 2572 scope.go:117] "RemoveContainer" containerID="a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee"
Apr 22 20:21:46.653095 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:21:46.653082 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee\": container with ID starting with a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee not found: ID does not exist" containerID="a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee"
Apr 22 20:21:46.653131 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.653100 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee"} err="failed to get container status \"a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee\": rpc error: code = NotFound desc = could not find container \"a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee\": container with ID starting with a79c12389839b1440602118f793fbb98528fc0d26d1223c1197725b2fb4425ee not found: ID does not exist"
Apr 22 20:21:46.658712 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.658690 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"]
Apr 22 20:21:46.664558 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:46.664539 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-whklr"]
Apr 22 20:21:48.170551 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:48.170516 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0603c7f6-bbcb-4fe6-a43c-4e3835491adf" path="/var/lib/kubelet/pods/0603c7f6-bbcb-4fe6-a43c-4e3835491adf/volumes"
Apr 22 20:21:56.639119 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:21:56.639022 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 22 20:22:06.638852 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:06.638810 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 22 20:22:16.638702 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:16.638643 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 22 20:22:26.638734 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:26.638690 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 22 20:22:36.638626 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:36.638586 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 22 20:22:46.639622 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:46.639593 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"
Apr 22 20:22:50.035050 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.035016 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"]
Apr 22 20:22:50.035449 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.035248 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="kserve-container" containerID="cri-o://e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a" gracePeriod=30
Apr 22 20:22:50.077723 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.077694 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"]
Apr 22 20:22:50.078360 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.078335 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0603c7f6-bbcb-4fe6-a43c-4e3835491adf" containerName="kserve-container"
Apr 22 20:22:50.078462 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.078363 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0603c7f6-bbcb-4fe6-a43c-4e3835491adf" containerName="kserve-container"
Apr 22 20:22:50.078462 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.078407 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0603c7f6-bbcb-4fe6-a43c-4e3835491adf" containerName="storage-initializer"
Apr 22 20:22:50.078462 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.078417 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0603c7f6-bbcb-4fe6-a43c-4e3835491adf" containerName="storage-initializer"
Apr 22 20:22:50.078621 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.078558 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0603c7f6-bbcb-4fe6-a43c-4e3835491adf" containerName="kserve-container"
Apr 22 20:22:50.082280 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.082261 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"
Apr 22 20:22:50.084846 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.084828 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\""
Apr 22 20:22:50.087473 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.087452 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"]
Apr 22 20:22:50.154035 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.154015 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/485ab14e-a760-48bc-a701-1e8305f18717-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-b7b485879-52qxz\" (UID: \"485ab14e-a760-48bc-a701-1e8305f18717\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"
Apr 22 20:22:50.255081 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.255057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/485ab14e-a760-48bc-a701-1e8305f18717-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-b7b485879-52qxz\" (UID: \"485ab14e-a760-48bc-a701-1e8305f18717\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"
Apr 22 20:22:50.255456 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.255437 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/485ab14e-a760-48bc-a701-1e8305f18717-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-b7b485879-52qxz\" (UID: \"485ab14e-a760-48bc-a701-1e8305f18717\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"
Apr 22 20:22:50.392316 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.392255 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"
Apr 22 20:22:50.508013 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.507992 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"]
Apr 22 20:22:50.510433 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:22:50.510410 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod485ab14e_a760_48bc_a701_1e8305f18717.slice/crio-ab0b83f34d09a83a8a2928cb0c2e05eabf624ebb1573b930af5072f806089afa WatchSource:0}: Error finding container ab0b83f34d09a83a8a2928cb0c2e05eabf624ebb1573b930af5072f806089afa: Status 404 returned error can't find the container with id ab0b83f34d09a83a8a2928cb0c2e05eabf624ebb1573b930af5072f806089afa
Apr 22 20:22:50.512356 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.512342 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:22:50.821016 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.820986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" event={"ID":"485ab14e-a760-48bc-a701-1e8305f18717","Type":"ContainerStarted","Data":"acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2"}
Apr 22 20:22:50.821016 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:50.821023 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" event={"ID":"485ab14e-a760-48bc-a701-1e8305f18717","Type":"ContainerStarted","Data":"ab0b83f34d09a83a8a2928cb0c2e05eabf624ebb1573b930af5072f806089afa"}
Apr 22 20:22:51.825109 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:51.825022 2572 generic.go:358] "Generic (PLEG): container finished" podID="485ab14e-a760-48bc-a701-1e8305f18717" containerID="acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2" exitCode=0
Apr 22 20:22:51.825109 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:51.825074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" event={"ID":"485ab14e-a760-48bc-a701-1e8305f18717","Type":"ContainerDied","Data":"acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2"}
Apr 22 20:22:52.829737 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:52.829698 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" event={"ID":"485ab14e-a760-48bc-a701-1e8305f18717","Type":"ContainerStarted","Data":"8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c"}
Apr 22 20:22:52.830169 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:52.829937 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"
Apr 22 20:22:52.831383 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:52.831355 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 22 20:22:52.846651 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:52.846605 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" podStartSLOduration=2.846591573 podStartE2EDuration="2.846591573s" podCreationTimestamp="2026-04-22 20:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:22:52.84543015 +0000 UTC m=+3587.254844663" watchObservedRunningTime="2026-04-22 20:22:52.846591573 +0000 UTC m=+3587.256006091"
Apr 22 20:22:53.173647 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.173627 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"
Apr 22 20:22:53.277421 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.277398 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef2c8de5-7360-4808-a21f-6b8ffcba776b-kserve-provision-location\") pod \"ef2c8de5-7360-4808-a21f-6b8ffcba776b\" (UID: \"ef2c8de5-7360-4808-a21f-6b8ffcba776b\") "
Apr 22 20:22:53.277708 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.277688 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef2c8de5-7360-4808-a21f-6b8ffcba776b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ef2c8de5-7360-4808-a21f-6b8ffcba776b" (UID: "ef2c8de5-7360-4808-a21f-6b8ffcba776b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:22:53.377871 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.377816 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef2c8de5-7360-4808-a21f-6b8ffcba776b-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\""
Apr 22 20:22:53.835736 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.835700 2572 generic.go:358] "Generic (PLEG): container finished" podID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerID="e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a" exitCode=0
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" Apr 22 20:22:53.836100 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.835782 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" event={"ID":"ef2c8de5-7360-4808-a21f-6b8ffcba776b","Type":"ContainerDied","Data":"e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a"} Apr 22 20:22:53.836100 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.835818 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls" event={"ID":"ef2c8de5-7360-4808-a21f-6b8ffcba776b","Type":"ContainerDied","Data":"c8ff34b92791daa77ccbbbaed8457f43660317e1100ddb919773e5eb111c3327"} Apr 22 20:22:53.836100 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.835849 2572 scope.go:117] "RemoveContainer" containerID="e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a" Apr 22 20:22:53.836499 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.836465 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 22 20:22:53.847857 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.847840 2572 scope.go:117] "RemoveContainer" containerID="cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4" Apr 22 20:22:53.854827 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.854809 2572 scope.go:117] "RemoveContainer" containerID="e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a" Apr 22 20:22:53.855095 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:22:53.855076 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a\": container with ID starting with e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a not found: ID does not exist" containerID="e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a" Apr 22 20:22:53.855157 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.855102 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a"} err="failed to get container status \"e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a\": rpc error: code = NotFound desc = could not find container \"e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a\": container with ID starting with e4c8bf3a6fdb6ed092bb964b5b30de5f0f36c08d22ed387aa8b42d84aa05e77a not found: ID does not exist" Apr 22 20:22:53.855157 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.855121 2572 scope.go:117] "RemoveContainer" containerID="cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4" Apr 22 20:22:53.855390 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:22:53.855371 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4\": container with ID starting with cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4 not found: ID does not exist" containerID="cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4" Apr 22 
20:22:53.855443 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.855394 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4"} err="failed to get container status \"cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4\": rpc error: code = NotFound desc = could not find container \"cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4\": container with ID starting with cd13b54d68100f5691b5ab949a0a50865b80ebb9141456a61edd34ccc8ea33e4 not found: ID does not exist" Apr 22 20:22:53.860021 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.859989 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"] Apr 22 20:22:53.863473 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:53.863451 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-fx2ls"] Apr 22 20:22:54.169809 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:22:54.169775 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" path="/var/lib/kubelet/pods/ef2c8de5-7360-4808-a21f-6b8ffcba776b/volumes" Apr 22 20:23:03.836552 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:23:03.836507 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 22 20:23:06.346359 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:23:06.346334 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 20:23:06.365581 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:23:06.365558 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log" Apr 22 20:23:13.836847 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:23:13.836797 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 22 20:23:23.836566 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:23:23.836516 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 22 20:23:33.837360 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:23:33.837309 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 22 20:23:43.836922 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:23:43.836872 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" 
podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 22 20:23:53.837857 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:23:53.837820 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" Apr 22 20:24:00.190961 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:00.190928 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"] Apr 22 20:24:00.191532 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:00.191245 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" containerID="cri-o://8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c" gracePeriod=30 Apr 22 20:24:01.515996 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.515966 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd"] Apr 22 20:24:01.516341 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.516237 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="kserve-container" Apr 22 20:24:01.516341 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.516248 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="kserve-container" Apr 22 20:24:01.516341 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.516257 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="storage-initializer" Apr 22 20:24:01.516341 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.516263 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="storage-initializer" Apr 22 20:24:01.516341 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.516338 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef2c8de5-7360-4808-a21f-6b8ffcba776b" containerName="kserve-container" Apr 22 20:24:01.519246 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.519228 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:24:01.521561 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.521537 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 20:24:01.526762 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.526735 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd"] Apr 22 20:24:01.626769 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.626737 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7576afea-4f23-470f-9299-668870cde429-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd\" (UID: \"7576afea-4f23-470f-9299-668870cde429\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:24:01.626917 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.626794 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7576afea-4f23-470f-9299-668870cde429-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd\" (UID: \"7576afea-4f23-470f-9299-668870cde429\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:24:01.728060 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.728031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7576afea-4f23-470f-9299-668870cde429-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd\" (UID: \"7576afea-4f23-470f-9299-668870cde429\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:24:01.728179 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.728083 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7576afea-4f23-470f-9299-668870cde429-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd\" (UID: \"7576afea-4f23-470f-9299-668870cde429\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:24:01.728390 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.728375 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7576afea-4f23-470f-9299-668870cde429-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd\" (UID: \"7576afea-4f23-470f-9299-668870cde429\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:24:01.728625 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.728608 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7576afea-4f23-470f-9299-668870cde429-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd\" (UID: \"7576afea-4f23-470f-9299-668870cde429\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:24:01.828991 ip-10-0-134-231 
kubenswrapper[2572]: I0422 20:24:01.828927 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:24:01.942675 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:01.942641 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd"] Apr 22 20:24:01.945150 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:24:01.945121 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7576afea_4f23_470f_9299_668870cde429.slice/crio-c19c1541fecb4e6fa9f0e247d6fa08f599a556f799978f2e2f45e24e4c2b2d62 WatchSource:0}: Error finding container c19c1541fecb4e6fa9f0e247d6fa08f599a556f799978f2e2f45e24e4c2b2d62: Status 404 returned error can't find the container with id c19c1541fecb4e6fa9f0e247d6fa08f599a556f799978f2e2f45e24e4c2b2d62 Apr 22 20:24:02.028022 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:02.027993 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" event={"ID":"7576afea-4f23-470f-9299-668870cde429","Type":"ContainerStarted","Data":"75dd8ae8451ae8b4bf43ecf0951c4f587e78076e3ada6263ca360e3da2985fc9"} Apr 22 20:24:02.028136 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:02.028028 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" event={"ID":"7576afea-4f23-470f-9299-668870cde429","Type":"ContainerStarted","Data":"c19c1541fecb4e6fa9f0e247d6fa08f599a556f799978f2e2f45e24e4c2b2d62"} Apr 22 20:24:03.032305 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:03.032268 2572 generic.go:358] "Generic (PLEG): container finished" podID="7576afea-4f23-470f-9299-668870cde429" containerID="75dd8ae8451ae8b4bf43ecf0951c4f587e78076e3ada6263ca360e3da2985fc9" exitCode=0 Apr 22 20:24:03.032720 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:03.032334 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" event={"ID":"7576afea-4f23-470f-9299-668870cde429","Type":"ContainerDied","Data":"75dd8ae8451ae8b4bf43ecf0951c4f587e78076e3ada6263ca360e3da2985fc9"} Apr 22 20:24:03.836879 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:03.836848 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 22 20:24:03.953994 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:03.953971 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" Apr 22 20:24:04.040777 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.040694 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" event={"ID":"7576afea-4f23-470f-9299-668870cde429","Type":"ContainerStarted","Data":"ea4a9b9cc16ce036889bb4aab6ab535d3b81a1f9fd4e582e30b311d8ad957782"} Apr 22 20:24:04.041430 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.040908 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:24:04.042172 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.042135 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 22 20:24:04.042289 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.042227 2572 generic.go:358] "Generic (PLEG): container finished" podID="485ab14e-a760-48bc-a701-1e8305f18717" containerID="8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c" exitCode=0 Apr 22 20:24:04.042289 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.042262 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" event={"ID":"485ab14e-a760-48bc-a701-1e8305f18717","Type":"ContainerDied","Data":"8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c"} Apr 22 20:24:04.042289 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.042279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" event={"ID":"485ab14e-a760-48bc-a701-1e8305f18717","Type":"ContainerDied","Data":"ab0b83f34d09a83a8a2928cb0c2e05eabf624ebb1573b930af5072f806089afa"} Apr 22 20:24:04.042389 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.042293 2572 scope.go:117] "RemoveContainer" containerID="8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c" Apr 22 20:24:04.042389 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.042348 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz" Apr 22 20:24:04.049245 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.049227 2572 scope.go:117] "RemoveContainer" containerID="acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2" Apr 22 20:24:04.052241 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.052222 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/485ab14e-a760-48bc-a701-1e8305f18717-kserve-provision-location\") pod \"485ab14e-a760-48bc-a701-1e8305f18717\" (UID: \"485ab14e-a760-48bc-a701-1e8305f18717\") " Apr 22 20:24:04.052486 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.052463 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485ab14e-a760-48bc-a701-1e8305f18717-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "485ab14e-a760-48bc-a701-1e8305f18717" (UID: "485ab14e-a760-48bc-a701-1e8305f18717"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:24:04.055999 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.055958 2572 scope.go:117] "RemoveContainer" containerID="8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c" Apr 22 20:24:04.056401 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:24:04.056338 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c\": container with ID starting with 8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c not found: ID does not exist" containerID="8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c" Apr 22 20:24:04.056508 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.056394 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c"} err="failed to get container status \"8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c\": rpc error: code = NotFound desc = could not find container \"8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c\": container with ID starting with 8029a8ae59230b2015807c52dddd7df69734198e4e1da4eedd81bb7fb5da777c not found: ID does not exist" Apr 22 20:24:04.056508 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.056431 2572 scope.go:117] "RemoveContainer" containerID="acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2" Apr 22 20:24:04.056752 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:24:04.056733 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2\": container with ID starting with acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2 not found: ID does not exist" containerID="acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2" Apr 22 20:24:04.056823 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.056761 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2"} err="failed to get container status \"acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2\": rpc error: code = NotFound desc = could not find container \"acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2\": container with ID starting with acb31ca58c348fae3e22332edba5ddc9faa59f894093be5976cb585d0333a1d2 not found: ID does not exist" Apr 22 20:24:04.057653 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.057619 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" podStartSLOduration=3.057605978 podStartE2EDuration="3.057605978s" podCreationTimestamp="2026-04-22 20:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:24:04.055251494 +0000 UTC m=+3658.464666008" watchObservedRunningTime="2026-04-22 20:24:04.057605978 +0000 UTC m=+3658.467020492" Apr 22 20:24:04.152918 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.152896 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/485ab14e-a760-48bc-a701-1e8305f18717-kserve-provision-location\") on 
node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:24:04.357307 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.357253 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"] Apr 22 20:24:04.361182 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:04.361163 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b7b485879-52qxz"] Apr 22 20:24:05.046919 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:05.046887 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 22 20:24:06.170221 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:06.170182 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485ab14e-a760-48bc-a701-1e8305f18717" path="/var/lib/kubelet/pods/485ab14e-a760-48bc-a701-1e8305f18717/volumes" Apr 22 20:24:15.047571 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:15.047531 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 22 20:24:25.047925 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:25.047885 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 22 20:24:35.047133 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:35.047052 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 22 20:24:45.047856 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:45.047819 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 22 20:24:55.047611 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:24:55.047569 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 22 20:25:05.047979 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:05.047951 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:25:11.575755 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:11.575723 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd"] Apr 22 20:25:11.578082 
ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:11.575996 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" containerID="cri-o://ea4a9b9cc16ce036889bb4aab6ab535d3b81a1f9fd4e582e30b311d8ad957782" gracePeriod=30 Apr 22 20:25:12.609608 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.609576 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4"] Apr 22 20:25:12.610085 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.610050 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="storage-initializer" Apr 22 20:25:12.610085 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.610069 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="storage-initializer" Apr 22 20:25:12.610213 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.610107 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" Apr 22 20:25:12.610213 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.610116 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" Apr 22 20:25:12.610213 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.610186 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="485ab14e-a760-48bc-a701-1e8305f18717" containerName="kserve-container" Apr 22 20:25:12.613129 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.613110 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" Apr 22 20:25:12.619897 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.619876 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4"] Apr 22 20:25:12.704098 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.704071 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de0dfd93-7360-4a4d-b4ec-baef20d3d42a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4\" (UID: \"de0dfd93-7360-4a4d-b4ec-baef20d3d42a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" Apr 22 20:25:12.805420 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.805376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de0dfd93-7360-4a4d-b4ec-baef20d3d42a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4\" (UID: \"de0dfd93-7360-4a4d-b4ec-baef20d3d42a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" Apr 22 20:25:12.805760 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.805741 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de0dfd93-7360-4a4d-b4ec-baef20d3d42a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4\" (UID: \"de0dfd93-7360-4a4d-b4ec-baef20d3d42a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" Apr 22 20:25:12.923679 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:12.923645 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" Apr 22 20:25:13.040223 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:13.040185 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4"] Apr 22 20:25:13.043408 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:25:13.043378 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde0dfd93_7360_4a4d_b4ec_baef20d3d42a.slice/crio-b20bcc20e632b141d60a54bd05237fb8188ab6ab10f900aa1ad9c3fb57c5ad9e WatchSource:0}: Error finding container b20bcc20e632b141d60a54bd05237fb8188ab6ab10f900aa1ad9c3fb57c5ad9e: Status 404 returned error can't find the container with id b20bcc20e632b141d60a54bd05237fb8188ab6ab10f900aa1ad9c3fb57c5ad9e Apr 22 20:25:13.240417 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:13.240326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" event={"ID":"de0dfd93-7360-4a4d-b4ec-baef20d3d42a","Type":"ContainerStarted","Data":"322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee"} Apr 22 20:25:13.240417 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:13.240370 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" event={"ID":"de0dfd93-7360-4a4d-b4ec-baef20d3d42a","Type":"ContainerStarted","Data":"b20bcc20e632b141d60a54bd05237fb8188ab6ab10f900aa1ad9c3fb57c5ad9e"} Apr 22 20:25:15.047079 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:15.047042 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 22 20:25:15.247683 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:15.247637 2572 generic.go:358] "Generic (PLEG): container finished" podID="7576afea-4f23-470f-9299-668870cde429" containerID="ea4a9b9cc16ce036889bb4aab6ab535d3b81a1f9fd4e582e30b311d8ad957782" exitCode=0 Apr 22 20:25:15.247836 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:15.247719 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" event={"ID":"7576afea-4f23-470f-9299-668870cde429","Type":"ContainerDied","Data":"ea4a9b9cc16ce036889bb4aab6ab535d3b81a1f9fd4e582e30b311d8ad957782"} Apr 22 20:25:15.339926 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:15.339905 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:25:15.426774 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:15.426748 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7576afea-4f23-470f-9299-668870cde429-kserve-provision-location\") pod \"7576afea-4f23-470f-9299-668870cde429\" (UID: \"7576afea-4f23-470f-9299-668870cde429\") " Apr 22 20:25:15.426894 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:15.426791 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7576afea-4f23-470f-9299-668870cde429-cabundle-cert\") pod \"7576afea-4f23-470f-9299-668870cde429\" (UID: \"7576afea-4f23-470f-9299-668870cde429\") " Apr 22 20:25:15.427037 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:15.427015 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7576afea-4f23-470f-9299-668870cde429-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7576afea-4f23-470f-9299-668870cde429" (UID: "7576afea-4f23-470f-9299-668870cde429"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:25:15.427099 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:15.427078 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7576afea-4f23-470f-9299-668870cde429-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "7576afea-4f23-470f-9299-668870cde429" (UID: "7576afea-4f23-470f-9299-668870cde429"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:25:15.528241 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:15.528207 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7576afea-4f23-470f-9299-668870cde429-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:25:15.528241 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:15.528237 2572 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7576afea-4f23-470f-9299-668870cde429-cabundle-cert\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:25:16.252068 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:16.252042 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" Apr 22 20:25:16.252068 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:16.252050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd" event={"ID":"7576afea-4f23-470f-9299-668870cde429","Type":"ContainerDied","Data":"c19c1541fecb4e6fa9f0e247d6fa08f599a556f799978f2e2f45e24e4c2b2d62"} Apr 22 20:25:16.252574 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:16.252096 2572 scope.go:117] "RemoveContainer" containerID="ea4a9b9cc16ce036889bb4aab6ab535d3b81a1f9fd4e582e30b311d8ad957782" Apr 22 20:25:16.260534 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:16.260522 2572 scope.go:117] "RemoveContainer" containerID="75dd8ae8451ae8b4bf43ecf0951c4f587e78076e3ada6263ca360e3da2985fc9" Apr 22 20:25:16.268835 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:16.268818 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd"] Apr 22 20:25:16.273163 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:16.273142 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-8544dfb646-lrtkd"] Apr 22 20:25:17.256330 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:17.256304 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4_de0dfd93-7360-4a4d-b4ec-baef20d3d42a/storage-initializer/0.log" Apr 22 20:25:17.256819 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:17.256340 2572 generic.go:358] "Generic (PLEG): container finished" podID="de0dfd93-7360-4a4d-b4ec-baef20d3d42a" containerID="322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee" exitCode=1 Apr 22 20:25:17.256819 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:17.256399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" event={"ID":"de0dfd93-7360-4a4d-b4ec-baef20d3d42a","Type":"ContainerDied","Data":"322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee"} Apr 22 20:25:18.169971 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:18.169934 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7576afea-4f23-470f-9299-668870cde429" path="/var/lib/kubelet/pods/7576afea-4f23-470f-9299-668870cde429/volumes" Apr 22 20:25:18.261613 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:18.261593 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4_de0dfd93-7360-4a4d-b4ec-baef20d3d42a/storage-initializer/0.log" Apr 22 20:25:18.261997 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:18.261645 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" event={"ID":"de0dfd93-7360-4a4d-b4ec-baef20d3d42a","Type":"ContainerStarted","Data":"55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398"} Apr 22 20:25:22.620102 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:22.620062 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4"] Apr 22 20:25:22.620631 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:22.620350 2572 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" podUID="de0dfd93-7360-4a4d-b4ec-baef20d3d42a" containerName="storage-initializer" containerID="cri-o://55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398" gracePeriod=30 Apr 22 20:25:23.650040 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:23.650018 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4_de0dfd93-7360-4a4d-b4ec-baef20d3d42a/storage-initializer/1.log" Apr 22 20:25:23.650375 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:23.650360 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4_de0dfd93-7360-4a4d-b4ec-baef20d3d42a/storage-initializer/0.log" Apr 22 20:25:23.650478 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:23.650420 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" Apr 22 20:25:23.790303 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:23.790243 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de0dfd93-7360-4a4d-b4ec-baef20d3d42a-kserve-provision-location\") pod \"de0dfd93-7360-4a4d-b4ec-baef20d3d42a\" (UID: \"de0dfd93-7360-4a4d-b4ec-baef20d3d42a\") " Apr 22 20:25:23.790492 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:23.790473 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de0dfd93-7360-4a4d-b4ec-baef20d3d42a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "de0dfd93-7360-4a4d-b4ec-baef20d3d42a" (UID: "de0dfd93-7360-4a4d-b4ec-baef20d3d42a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:25:23.891119 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:23.891098 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de0dfd93-7360-4a4d-b4ec-baef20d3d42a-kserve-provision-location\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:25:24.278942 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.278921 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4_de0dfd93-7360-4a4d-b4ec-baef20d3d42a/storage-initializer/1.log" Apr 22 20:25:24.279293 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.279279 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4_de0dfd93-7360-4a4d-b4ec-baef20d3d42a/storage-initializer/0.log" Apr 22 20:25:24.279338 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.279311 2572 generic.go:358] "Generic (PLEG): container finished" podID="de0dfd93-7360-4a4d-b4ec-baef20d3d42a" containerID="55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398" exitCode=1 Apr 22 20:25:24.279398 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.279381 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" Apr 22 20:25:24.279477 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.279385 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" event={"ID":"de0dfd93-7360-4a4d-b4ec-baef20d3d42a","Type":"ContainerDied","Data":"55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398"} Apr 22 20:25:24.279514 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.279494 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4" event={"ID":"de0dfd93-7360-4a4d-b4ec-baef20d3d42a","Type":"ContainerDied","Data":"b20bcc20e632b141d60a54bd05237fb8188ab6ab10f900aa1ad9c3fb57c5ad9e"} Apr 22 20:25:24.279514 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.279511 2572 scope.go:117] "RemoveContainer" containerID="55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398" Apr 22 20:25:24.286832 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.286814 2572 scope.go:117] "RemoveContainer" containerID="322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee" Apr 22 20:25:24.293196 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.293173 2572 scope.go:117] "RemoveContainer" containerID="55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398" Apr 22 20:25:24.293441 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:25:24.293421 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398\": container with ID starting with 55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398 not found: ID does not exist" containerID="55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398" Apr 22 20:25:24.293493 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.293449 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398"} err="failed to get container status \"55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398\": rpc error: code = NotFound desc = could not find container \"55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398\": container with ID starting with 55f2db288e13347c09432792e7f3e4e6303617d0f3c08d7720fe09a536c52398 not found: ID does not exist" Apr 22 20:25:24.293493 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.293466 2572 scope.go:117] "RemoveContainer" containerID="322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee" Apr 22 20:25:24.293704 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:25:24.293685 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee\": container with ID starting with 322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee not found: ID does not exist" containerID="322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee" Apr 22 20:25:24.293751 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.293711 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee"} err="failed to get container status 
\"322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee\": rpc error: code = NotFound desc = could not find container \"322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee\": container with ID starting with 322d9c7db45d7a362c9ba4982353d15b8df034a7808c2fe2c83dee42c4c74fee not found: ID does not exist" Apr 22 20:25:24.308991 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.308969 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4"] Apr 22 20:25:24.312268 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.312248 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-778977bddd-698b4"] Apr 22 20:25:24.642196 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642130 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zcf5d/must-gather-mlp74"] Apr 22 20:25:24.642441 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642429 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" Apr 22 20:25:24.642481 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642443 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" Apr 22 20:25:24.642481 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642456 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de0dfd93-7360-4a4d-b4ec-baef20d3d42a" containerName="storage-initializer" Apr 22 20:25:24.642481 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642462 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de0dfd93-7360-4a4d-b4ec-baef20d3d42a" containerName="storage-initializer" Apr 22 20:25:24.642481 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642470 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de0dfd93-7360-4a4d-b4ec-baef20d3d42a" containerName="storage-initializer" Apr 22 20:25:24.642481 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642475 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de0dfd93-7360-4a4d-b4ec-baef20d3d42a" containerName="storage-initializer" Apr 22 20:25:24.642630 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642485 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="storage-initializer" Apr 22 20:25:24.642630 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642491 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="storage-initializer" Apr 22 20:25:24.642630 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642537 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="de0dfd93-7360-4a4d-b4ec-baef20d3d42a" containerName="storage-initializer" Apr 22 20:25:24.642630 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642546 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7576afea-4f23-470f-9299-668870cde429" containerName="kserve-container" Apr 22 20:25:24.642630 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.642553 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="de0dfd93-7360-4a4d-b4ec-baef20d3d42a" containerName="storage-initializer" Apr 22 20:25:24.646608 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.646593 2572 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zcf5d/must-gather-mlp74" Apr 22 20:25:24.649035 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.649015 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zcf5d\"/\"default-dockercfg-d85ll\"" Apr 22 20:25:24.649144 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.649015 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zcf5d\"/\"openshift-service-ca.crt\"" Apr 22 20:25:24.649144 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.649083 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zcf5d\"/\"kube-root-ca.crt\"" Apr 22 20:25:24.654185 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.654164 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zcf5d/must-gather-mlp74"] Apr 22 20:25:24.796793 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.796763 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-must-gather-output\") pod \"must-gather-mlp74\" (UID: \"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc\") " pod="openshift-must-gather-zcf5d/must-gather-mlp74" Apr 22 20:25:24.796911 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.796829 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qpnb\" (UniqueName: \"kubernetes.io/projected/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-kube-api-access-7qpnb\") pod \"must-gather-mlp74\" (UID: \"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc\") " pod="openshift-must-gather-zcf5d/must-gather-mlp74" Apr 22 20:25:24.897534 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.897463 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qpnb\" (UniqueName: \"kubernetes.io/projected/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-kube-api-access-7qpnb\") pod \"must-gather-mlp74\" (UID: \"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc\") " pod="openshift-must-gather-zcf5d/must-gather-mlp74" Apr 22 20:25:24.897534 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.897525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-must-gather-output\") pod \"must-gather-mlp74\" (UID: \"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc\") " pod="openshift-must-gather-zcf5d/must-gather-mlp74" Apr 22 20:25:24.897832 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.897817 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-must-gather-output\") pod \"must-gather-mlp74\" (UID: \"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc\") " pod="openshift-must-gather-zcf5d/must-gather-mlp74" Apr 22 20:25:24.905031 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:24.905010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qpnb\" (UniqueName: \"kubernetes.io/projected/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-kube-api-access-7qpnb\") pod \"must-gather-mlp74\" (UID: \"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc\") " pod="openshift-must-gather-zcf5d/must-gather-mlp74" Apr 22 20:25:24.967613 ip-10-0-134-231 kubenswrapper[2572]: I0422 
20:25:24.967596 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zcf5d/must-gather-mlp74" Apr 22 20:25:25.081078 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:25.081042 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zcf5d/must-gather-mlp74"] Apr 22 20:25:25.083234 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:25:25.083209 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b3e77ee_cf6c_4469_b9b5_0cad5b6ef6fc.slice/crio-8335f20adceaf334bb457f2b4cf33cc301244a9bc992f99b917ba15f322fa404 WatchSource:0}: Error finding container 8335f20adceaf334bb457f2b4cf33cc301244a9bc992f99b917ba15f322fa404: Status 404 returned error can't find the container with id 8335f20adceaf334bb457f2b4cf33cc301244a9bc992f99b917ba15f322fa404 Apr 22 20:25:25.283320 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:25.283291 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zcf5d/must-gather-mlp74" event={"ID":"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc","Type":"ContainerStarted","Data":"8335f20adceaf334bb457f2b4cf33cc301244a9bc992f99b917ba15f322fa404"} Apr 22 20:25:26.171495 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:26.171436 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de0dfd93-7360-4a4d-b4ec-baef20d3d42a" path="/var/lib/kubelet/pods/de0dfd93-7360-4a4d-b4ec-baef20d3d42a/volumes" Apr 22 20:25:30.301205 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:30.301173 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zcf5d/must-gather-mlp74" event={"ID":"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc","Type":"ContainerStarted","Data":"fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f"} Apr 22 20:25:30.301205 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:30.301206 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zcf5d/must-gather-mlp74" event={"ID":"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc","Type":"ContainerStarted","Data":"d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303"} Apr 22 20:25:30.317884 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:30.317840 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zcf5d/must-gather-mlp74" podStartSLOduration=2.021124881 podStartE2EDuration="6.317826451s" podCreationTimestamp="2026-04-22 20:25:24 +0000 UTC" firstStartedPulling="2026-04-22 20:25:25.08489664 +0000 UTC m=+3739.494311131" lastFinishedPulling="2026-04-22 20:25:29.38159821 +0000 UTC m=+3743.791012701" observedRunningTime="2026-04-22 20:25:30.315842881 +0000 UTC m=+3744.725257393" watchObservedRunningTime="2026-04-22 20:25:30.317826451 +0000 UTC m=+3744.727240964" Apr 22 20:25:50.362792 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:50.362704 2572 generic.go:358] "Generic (PLEG): container finished" podID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" containerID="d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303" exitCode=0 Apr 22 20:25:50.363183 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:50.362784 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zcf5d/must-gather-mlp74" event={"ID":"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc","Type":"ContainerDied","Data":"d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303"} Apr 22 20:25:50.363183 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:50.363072 2572 scope.go:117] 
"RemoveContainer" containerID="d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303" Apr 22 20:25:50.984122 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:50.984084 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zcf5d_must-gather-mlp74_4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc/gather/0.log" Apr 22 20:25:54.408141 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:54.408103 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-84wfc_91f11754-ebaa-489c-8dad-189955ae35aa/global-pull-secret-syncer/0.log" Apr 22 20:25:54.644632 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:54.644603 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-hqv99_1dd58df5-bc2c-432a-844a-887a587be426/konnectivity-agent/0.log" Apr 22 20:25:54.751021 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:54.750955 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-231.ec2.internal_52cc10a5c9c1a625e58617dcad0f895c/haproxy/0.log" Apr 22 20:25:56.516546 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.516475 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zcf5d/must-gather-mlp74"] Apr 22 20:25:56.516903 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.516723 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-zcf5d/must-gather-mlp74" podUID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" containerName="copy" containerID="cri-o://fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f" gracePeriod=2 Apr 22 20:25:56.521075 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.521049 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zcf5d/must-gather-mlp74"] Apr 22 20:25:56.732330 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.732310 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zcf5d_must-gather-mlp74_4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc/copy/0.log" Apr 22 20:25:56.732621 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.732606 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zcf5d/must-gather-mlp74" Apr 22 20:25:56.734685 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.734651 2572 status_manager.go:895] "Failed to get status for pod" podUID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" pod="openshift-must-gather-zcf5d/must-gather-mlp74" err="pods \"must-gather-mlp74\" is forbidden: User \"system:node:ip-10-0-134-231.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-zcf5d\": no relationship found between node 'ip-10-0-134-231.ec2.internal' and this object" Apr 22 20:25:56.846723 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.846655 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qpnb\" (UniqueName: \"kubernetes.io/projected/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-kube-api-access-7qpnb\") pod \"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc\" (UID: \"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc\") " Apr 22 20:25:56.846808 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.846742 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-must-gather-output\") pod \"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc\" (UID: \"4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc\") " Apr 22 20:25:56.848111 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.848086 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" (UID: "4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:25:56.848726 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.848711 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-kube-api-access-7qpnb" (OuterVolumeSpecName: "kube-api-access-7qpnb") pod "4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" (UID: "4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc"). InnerVolumeSpecName "kube-api-access-7qpnb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:25:56.947908 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.947885 2572 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-must-gather-output\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:25:56.947908 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:56.947904 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qpnb\" (UniqueName: \"kubernetes.io/projected/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc-kube-api-access-7qpnb\") on node \"ip-10-0-134-231.ec2.internal\" DevicePath \"\"" Apr 22 20:25:57.386084 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:57.386059 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zcf5d_must-gather-mlp74_4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc/copy/0.log" Apr 22 20:25:57.386453 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:57.386428 2572 generic.go:358] "Generic (PLEG): container finished" podID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" containerID="fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f" exitCode=143 Apr 22 20:25:57.386538 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:57.386497 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zcf5d/must-gather-mlp74" Apr 22 20:25:57.386585 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:57.386535 2572 scope.go:117] "RemoveContainer" containerID="fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f" Apr 22 20:25:57.389030 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:57.388996 2572 status_manager.go:895] "Failed to get status for pod" podUID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" pod="openshift-must-gather-zcf5d/must-gather-mlp74" err="pods \"must-gather-mlp74\" is forbidden: User \"system:node:ip-10-0-134-231.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-zcf5d\": no relationship found between node 'ip-10-0-134-231.ec2.internal' and this object" Apr 22 20:25:57.394592 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:57.394572 2572 scope.go:117] "RemoveContainer" containerID="d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303" Apr 22 20:25:57.396499 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:57.396475 2572 status_manager.go:895] "Failed to get status for pod" podUID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" pod="openshift-must-gather-zcf5d/must-gather-mlp74" err="pods \"must-gather-mlp74\" is forbidden: User \"system:node:ip-10-0-134-231.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-zcf5d\": no relationship found between node 'ip-10-0-134-231.ec2.internal' and this object" Apr 22 20:25:57.406118 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:57.406100 2572 scope.go:117] "RemoveContainer" containerID="fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f" Apr 22 20:25:57.406353 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:25:57.406335 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f\": container with ID starting with fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f not found: ID does not exist" containerID="fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f" Apr 22 
20:25:57.406407 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:57.406360 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f"} err="failed to get container status \"fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f\": rpc error: code = NotFound desc = could not find container \"fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f\": container with ID starting with fb9c286874b4469f0284124d9255c65f37a16326b848770575b96deff1b5e02f not found: ID does not exist" Apr 22 20:25:57.406407 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:57.406379 2572 scope.go:117] "RemoveContainer" containerID="d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303" Apr 22 20:25:57.406615 ip-10-0-134-231 kubenswrapper[2572]: E0422 20:25:57.406599 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303\": container with ID starting with d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303 not found: ID does not exist" containerID="d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303" Apr 22 20:25:57.406757 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:57.406617 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303"} err="failed to get container status \"d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303\": rpc error: code = NotFound desc = could not find container \"d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303\": container with ID starting with d99b6d7be02068961e0207247749ca9dfdda636c13acae713347bb9d3fef0303 not found: ID does not exist" Apr 22 20:25:58.169629 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.169600 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" path="/var/lib/kubelet/pods/4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc/volumes" Apr 22 20:25:58.393437 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.393409 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-95jw8_7782c208-eb03-4951-80cf-8089fcbf8cb4/cluster-monitoring-operator/0.log" Apr 22 20:25:58.415657 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.415616 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mc82c_31037d76-505c-4078-9baf-70abb3048ac9/kube-state-metrics/0.log" Apr 22 20:25:58.438090 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.438042 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mc82c_31037d76-505c-4078-9baf-70abb3048ac9/kube-rbac-proxy-main/0.log" Apr 22 20:25:58.460076 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.460060 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mc82c_31037d76-505c-4078-9baf-70abb3048ac9/kube-rbac-proxy-self/0.log" Apr 22 20:25:58.542585 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.542563 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dc29n_bff67443-d8bc-4920-819f-6c767147500e/node-exporter/0.log" Apr 22 20:25:58.562398 ip-10-0-134-231 
Apr 22 20:25:58.583630 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.583614 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dc29n_bff67443-d8bc-4920-819f-6c767147500e/init-textfile/0.log"
Apr 22 20:25:58.848615 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.848556 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_826e6e59-5db2-4b03-9772-c97128b47870/prometheus/0.log"
Apr 22 20:25:58.865886 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.865869 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_826e6e59-5db2-4b03-9772-c97128b47870/config-reloader/0.log"
Apr 22 20:25:58.890125 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.890098 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_826e6e59-5db2-4b03-9772-c97128b47870/thanos-sidecar/0.log"
Apr 22 20:25:58.910122 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.910101 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_826e6e59-5db2-4b03-9772-c97128b47870/kube-rbac-proxy-web/0.log"
Apr 22 20:25:58.929943 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.929926 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_826e6e59-5db2-4b03-9772-c97128b47870/kube-rbac-proxy/0.log"
Apr 22 20:25:58.950696 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.950656 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_826e6e59-5db2-4b03-9772-c97128b47870/kube-rbac-proxy-thanos/0.log"
Apr 22 20:25:58.977063 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:25:58.977038 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_826e6e59-5db2-4b03-9772-c97128b47870/init-config-reloader/0.log"
Apr 22 20:26:00.802949 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:00.802917 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/1.log"
Apr 22 20:26:00.811559 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:00.811538 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-294vl_94a5da67-cf92-4ff3-ace9-bb2d6bf8bcf1/console-operator/2.log"
Apr 22 20:26:01.406736 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.406708 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"]
Apr 22 20:26:01.406984 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.406971 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" containerName="gather"
Apr 22 20:26:01.406984 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.406983 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" containerName="gather"
Apr 22 20:26:01.407118 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.406999 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" containerName="copy"
Apr 22 20:26:01.407118 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.407005 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" containerName="copy"
Apr 22 20:26:01.407118 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.407075 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" containerName="copy"
Apr 22 20:26:01.407118 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.407083 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b3e77ee-cf6c-4469-b9b5-0cad5b6ef6fc" containerName="gather"
Apr 22 20:26:01.412089 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.412065 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.414443 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.414424 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9c5pk\"/\"kube-root-ca.crt\""
Apr 22 20:26:01.415888 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.415496 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9c5pk\"/\"default-dockercfg-lz96s\""
Apr 22 20:26:01.415888 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.415545 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9c5pk\"/\"openshift-service-ca.crt\""
Apr 22 20:26:01.417915 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.417896 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"]
Apr 22 20:26:01.478438 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.478416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-proc\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.478533 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.478445 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jvcr\" (UniqueName: \"kubernetes.io/projected/ea98c405-a083-4c61-b94a-710aaaa50523-kube-api-access-4jvcr\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.478533 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.478465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-sys\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.478533 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.478496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-lib-modules\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.478656 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.478558 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-podres\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.579335 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.579308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-proc\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.579474 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.579340 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jvcr\" (UniqueName: \"kubernetes.io/projected/ea98c405-a083-4c61-b94a-710aaaa50523-kube-api-access-4jvcr\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.579474 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.579357 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-sys\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.579474 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.579428 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-sys\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.579474 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.579439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-proc\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.579474 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.579464 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-lib-modules\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.579698 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.579497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-podres\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.579698 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.579642 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-podres\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.579698 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.579657 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea98c405-a083-4c61-b94a-710aaaa50523-lib-modules\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.586932 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.586915 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jvcr\" (UniqueName: \"kubernetes.io/projected/ea98c405-a083-4c61-b94a-710aaaa50523-kube-api-access-4jvcr\") pod \"perf-node-gather-daemonset-wqcq9\" (UID: \"ea98c405-a083-4c61-b94a-710aaaa50523\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.723767 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.723712 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:01.839295 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:01.839269 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"]
Apr 22 20:26:01.842270 ip-10-0-134-231 kubenswrapper[2572]: W0422 20:26:01.842241 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podea98c405_a083_4c61_b94a_710aaaa50523.slice/crio-3bf0b1af05ad0b1eab7cba5dfabf0db03ba8c0066a53f3d873e88d3999e9a32f WatchSource:0}: Error finding container 3bf0b1af05ad0b1eab7cba5dfabf0db03ba8c0066a53f3d873e88d3999e9a32f: Status 404 returned error can't find the container with id 3bf0b1af05ad0b1eab7cba5dfabf0db03ba8c0066a53f3d873e88d3999e9a32f
Apr 22 20:26:02.307411 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:02.307388 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cr9b2_b3237a2b-dfbc-4c60-8166-c94d61b4467f/dns/0.log"
Apr 22 20:26:02.330713 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:02.330691 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cr9b2_b3237a2b-dfbc-4c60-8166-c94d61b4467f/kube-rbac-proxy/0.log"
Apr 22 20:26:02.402820 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:02.402792 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9" event={"ID":"ea98c405-a083-4c61-b94a-710aaaa50523","Type":"ContainerStarted","Data":"1a479b85a805c0de8980d074f4afcd3f1d0d868813c55124f2ead86f7defb38b"}
Apr 22 20:26:02.402820 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:02.402822 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9" event={"ID":"ea98c405-a083-4c61-b94a-710aaaa50523","Type":"ContainerStarted","Data":"3bf0b1af05ad0b1eab7cba5dfabf0db03ba8c0066a53f3d873e88d3999e9a32f"}
Apr 22 20:26:02.402997 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:02.402916 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:02.421434 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:02.421381 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9" podStartSLOduration=1.42136983 podStartE2EDuration="1.42136983s" podCreationTimestamp="2026-04-22 20:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:26:02.419964124 +0000 UTC m=+3776.829378653" watchObservedRunningTime="2026-04-22 20:26:02.42136983 +0000 UTC m=+3776.830784342"
Apr 22 20:26:02.441015 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:02.440995 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zv45q_2c867e00-703c-41c3-8964-4eee7b3451c9/dns-node-resolver/0.log"
Apr 22 20:26:02.904109 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:02.904079 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d9v44_1130a4f2-a77f-4484-b9a9-046f3553e57b/node-ca/0.log"
Apr 22 20:26:03.552876 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:03.552849 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-74d655c858-jl4ng_c452dcdb-3433-4c9c-a0e5-3f3bea4be3d5/router/0.log"
Apr 22 20:26:03.882016 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:03.881988 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bv9cr_75bc368d-1a1a-4f77-9a39-1a1b256f1eb6/serve-healthcheck-canary/0.log"
Apr 22 20:26:04.277897 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:04.277870 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j72zs_64c67afe-5ac1-489a-897a-9fcfbfb51c30/kube-rbac-proxy/0.log"
Apr 22 20:26:04.298287 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:04.298262 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j72zs_64c67afe-5ac1-489a-897a-9fcfbfb51c30/exporter/0.log"
Apr 22 20:26:04.319609 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:04.319588 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j72zs_64c67afe-5ac1-489a-897a-9fcfbfb51c30/extractor/0.log"
Apr 22 20:26:06.488092 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:06.488069 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-k5hv4_23afe627-399d-4e32-9760-bac458db5ba8/server/0.log"
Apr 22 20:26:06.888608 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:06.888470 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-h5lwd_fbca928d-0eee-4a59-a378-4bcc47205b07/s3-init/0.log"
Apr 22 20:26:06.910460 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:06.910441 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-vn9ng_4ef1cde4-40cc-4920-9853-ffcd9ebc7560/s3-tls-init-custom/0.log"
Apr 22 20:26:06.932036 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:06.932016 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-fdt9x_8e5715bb-8ec7-4f65-b10e-08a16b7d6a3f/s3-tls-init-serving/0.log"
Apr 22 20:26:08.416243 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:08.416215 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-wqcq9"
Apr 22 20:26:10.881308 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:10.881287 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-s9hpf_d96bb751-7faf-40c6-b20a-4ce5e5009c3e/migrator/0.log"
Apr 22 20:26:10.910137 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:10.910116 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-s9hpf_d96bb751-7faf-40c6-b20a-4ce5e5009c3e/graceful-termination/0.log"
Apr 22 20:26:12.447150 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:12.447127 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gxnvz_fc27d5a5-d1ec-4f19-823f-585b5366a986/kube-multus-additional-cni-plugins/0.log"
Apr 22 20:26:12.467691 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:12.467652 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gxnvz_fc27d5a5-d1ec-4f19-823f-585b5366a986/egress-router-binary-copy/0.log"
Apr 22 20:26:12.488828 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:12.488802 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gxnvz_fc27d5a5-d1ec-4f19-823f-585b5366a986/cni-plugins/0.log"
Apr 22 20:26:12.510397 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:12.510379 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gxnvz_fc27d5a5-d1ec-4f19-823f-585b5366a986/bond-cni-plugin/0.log"
Apr 22 20:26:12.531961 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:12.531944 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gxnvz_fc27d5a5-d1ec-4f19-823f-585b5366a986/routeoverride-cni/0.log"
Apr 22 20:26:12.555625 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:12.555607 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gxnvz_fc27d5a5-d1ec-4f19-823f-585b5366a986/whereabouts-cni-bincopy/0.log"
Apr 22 20:26:12.576022 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:12.576002 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gxnvz_fc27d5a5-d1ec-4f19-823f-585b5366a986/whereabouts-cni/0.log"
Apr 22 20:26:12.644045 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:12.643994 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xt29w_453fe14e-87da-4452-87b2-f3814fcb9406/kube-multus/0.log"
Apr 22 20:26:12.812161 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:12.812139 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-w244z_def7cd86-6b79-4c5f-900f-f09644520b6b/network-metrics-daemon/0.log"
Apr 22 20:26:12.829838 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:12.829819 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-w244z_def7cd86-6b79-4c5f-900f-f09644520b6b/kube-rbac-proxy/0.log"
Apr 22 20:26:13.527835 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:13.527788 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jbww_19a311dd-1aa9-4326-8424-accc5f5f330c/ovn-controller/0.log"
Apr 22 20:26:13.577204 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:13.577180 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jbww_19a311dd-1aa9-4326-8424-accc5f5f330c/ovn-acl-logging/0.log"
Apr 22 20:26:13.601580 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:13.601559 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jbww_19a311dd-1aa9-4326-8424-accc5f5f330c/kube-rbac-proxy-node/0.log"
Apr 22 20:26:13.624279 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:13.624239 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jbww_19a311dd-1aa9-4326-8424-accc5f5f330c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 20:26:13.641618 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:13.641599 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jbww_19a311dd-1aa9-4326-8424-accc5f5f330c/northd/0.log"
Apr 22 20:26:13.661214 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:13.661192 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jbww_19a311dd-1aa9-4326-8424-accc5f5f330c/nbdb/0.log"
Apr 22 20:26:13.684938 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:13.684917 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jbww_19a311dd-1aa9-4326-8424-accc5f5f330c/sbdb/0.log"
Apr 22 20:26:13.871409 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:13.871356 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jbww_19a311dd-1aa9-4326-8424-accc5f5f330c/ovnkube-controller/0.log"
Apr 22 20:26:15.662419 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:15.662390 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qptxp_90bba156-f068-4b96-a366-bae94c48b2b6/network-check-target-container/0.log"
Apr 22 20:26:16.580210 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:16.580177 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-vvjql_623abfa5-3755-40bb-bd0c-3fbea347ccc5/iptables-alerter/0.log"
Apr 22 20:26:17.260759 ip-10-0-134-231 kubenswrapper[2572]: I0422 20:26:17.260725 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-v2tph_4d1c5d77-dc78-43c4-92de-a6f2ba9c71ef/tuned/0.log"