Apr 17 07:48:36.913531 ip-10-0-138-143 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 07:48:36.913546 ip-10-0-138-143 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 07:48:36.913556 ip-10-0-138-143 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 07:48:36.913860 ip-10-0-138-143 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 07:48:47.004272 ip-10-0-138-143 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 07:48:47.004290 ip-10-0-138-143 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b560ed2323ab42449c74e1fa663344c5 --
Apr 17 07:51:12.554708 ip-10-0-138-143 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 07:51:12.935464 ip-10-0-138-143 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:12.935464 ip-10-0-138-143 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 07:51:12.935464 ip-10-0-138-143 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:12.935464 ip-10-0-138-143 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 07:51:12.935464 ip-10-0-138-143 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:12.936955 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.936876    2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 07:51:12.939150 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939136    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:12.939150 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939151    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939155    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939158    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939161    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939164    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939167    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939170    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939173    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939176    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939179    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939181    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939184    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939187    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939189    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939192    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939195    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939198    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939200    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939203    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939205    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:12.939214 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939208    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939211    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939213    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939216    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939225    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939228    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939231    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939233    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939236    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939238    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939242    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939246    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939249    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939252    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939254    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939257    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939259    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939262    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939264    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:12.939700 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939267    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939270    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939272    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939274    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939277    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939279    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939282    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939284    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939286    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939289    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939291    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939294    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939297    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939299    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939302    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939306    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939309    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939312    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939314    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:12.940149 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939317    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939320    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939322    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939326    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939328    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939331    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939334    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939337    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939340    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939345    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939348    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939350    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939354    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939357    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939359    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939362    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939365    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939367    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939370    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939372    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:12.940650 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939389    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939391    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939394    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939397    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939399    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939402    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939405    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939812    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939818    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939821    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939824    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939827    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939830    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939833    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939836    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939839    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939842    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939845    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939847    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939849    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:12.941250 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939852    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939855    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939857    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939860    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939862    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939865    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939867    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939870    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939873    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939875    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939878    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939881    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939884    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939886    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939889    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939892    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939894    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939897    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939899    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939901    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:12.941754 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939904    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939907    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939910    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939912    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939916    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939919    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939922    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939925    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939927    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939930    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939932    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939935    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939937    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939940    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939943    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939945    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939948    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939950    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939953    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:12.942243 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939955    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939958    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939960    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939963    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939966    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939969    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939971    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939973    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939976    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939978    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939981    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939984    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939986    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939989    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939991    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939994    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939996    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.939999    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940001    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940004    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:12.942727 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940007    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940009    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940011    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940014    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940017    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940020    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940024    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940027    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940030    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940033    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940035    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940038    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940040    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940043    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940114    2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940121    2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940128    2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940133    2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940137    2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940140    2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940144    2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 07:51:12.943207 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940149    2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940152    2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940159    2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940163    2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940166    2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940169    2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940172    2573 flags.go:64] FLAG: --cgroup-root=""
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940175    2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940178    2573 flags.go:64] FLAG: --client-ca-file=""
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940181    2573 flags.go:64] FLAG: --cloud-config=""
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940183    2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940186    2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940190    2573 flags.go:64] FLAG: --cluster-domain=""
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940193    2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940196    2573 flags.go:64] FLAG: --config-dir=""
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940198    2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940202    2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940205    2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940208    2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940211    2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940214    2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940217    2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940220    2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940223    2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940226    2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 07:51:12.943733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940229    2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940233    2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940237    2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940240    2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940243    2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940246    2573 flags.go:64] FLAG: --enable-server="true"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940249    2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940254    2573 flags.go:64] FLAG: --event-burst="100"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940257    2573 flags.go:64] FLAG: --event-qps="50"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940260    2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940263    2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940266    2573 flags.go:64] FLAG: --eviction-hard=""
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940270    2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940273    2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940276    2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940279    2573 flags.go:64] FLAG: --eviction-soft=""
Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940281    2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 07:51:12.944313 ip-10-0-138-143
kubenswrapper[2573]: I0417 07:51:12.940284 2573 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940287 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940291 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940294 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940297 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940299 2573 flags.go:64] FLAG: --feature-gates="" Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940303 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940306 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 07:51:12.944313 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940309 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940312 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940315 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940318 2573 flags.go:64] FLAG: --help="false" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940321 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-138-143.ec2.internal" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940324 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940327 2573 flags.go:64] FLAG: 
--http-check-frequency="20s" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940329 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940333 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940336 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940340 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940343 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940345 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940348 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940351 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940354 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940358 2573 flags.go:64] FLAG: --kube-reserved="" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940361 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940364 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940367 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 07:51:12.944935 ip-10-0-138-143 
kubenswrapper[2573]: I0417 07:51:12.940370 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940372 2573 flags.go:64] FLAG: --lock-file="" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940389 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940393 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 07:51:12.944935 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940396 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940401 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940404 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940407 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940409 2573 flags.go:64] FLAG: --logging-format="text" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940412 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940416 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940419 2573 flags.go:64] FLAG: --manifest-url="" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940421 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940426 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940428 2573 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940432 2573 flags.go:64] FLAG: --max-pods="110" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940435 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940438 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940441 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940444 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940447 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940450 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940453 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940460 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940463 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940466 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940469 2573 flags.go:64] FLAG: --pod-cidr="" Apr 17 07:51:12.945524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940472 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 07:51:12.946117 ip-10-0-138-143 
kubenswrapper[2573]: I0417 07:51:12.940480 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940483 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940486 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940490 2573 flags.go:64] FLAG: --port="10250" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940493 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940495 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-022f24371931f1bcc" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940498 2573 flags.go:64] FLAG: --qos-reserved="" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940501 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940505 2573 flags.go:64] FLAG: --register-node="true" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940507 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940510 2573 flags.go:64] FLAG: --register-with-taints="" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940514 2573 flags.go:64] FLAG: --registry-burst="10" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940517 2573 flags.go:64] FLAG: --registry-qps="5" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940520 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940523 2573 flags.go:64] FLAG: --reserved-memory="" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 
07:51:12.940526 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940529 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940532 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940535 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940538 2573 flags.go:64] FLAG: --runonce="false" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940541 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940544 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940547 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940549 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940552 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 07:51:12.946117 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940555 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940558 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940561 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940564 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940567 2573 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940570 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940572 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940577 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940580 2573 flags.go:64] FLAG: --system-cgroups="" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940583 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940588 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940591 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940594 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940597 2573 flags.go:64] FLAG: --tls-min-version="" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940600 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940603 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940606 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940609 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940612 2573 flags.go:64] FLAG: --v="2" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 
07:51:12.940616 2573 flags.go:64] FLAG: --version="false" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940620 2573 flags.go:64] FLAG: --vmodule="" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940624 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.940627 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940734 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:51:12.946770 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940738 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940741 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940744 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940747 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940750 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940753 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940757 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940761 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940764 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940767 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940769 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940772 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940781 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940784 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940787 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940794 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940798 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940801 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940804 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940807 2573 feature_gate.go:328] unrecognized 
feature gate: MachineConfigNodes Apr 17 07:51:12.947331 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940810 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940813 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940815 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940818 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940820 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940823 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940826 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940828 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940831 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940833 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940836 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940839 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:51:12.947857 ip-10-0-138-143 
kubenswrapper[2573]: W0417 07:51:12.940841 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940844 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940846 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940849 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940851 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940854 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940856 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940859 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:51:12.947857 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940861 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940863 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940866 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940869 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940871 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 
07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940881 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940884 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940888 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940892 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940895 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940897 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940900 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940902 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940905 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940908 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940910 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940913 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940916 
2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940918 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:51:12.948347 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940920 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940923 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940925 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940928 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940931 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940933 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940936 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940938 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940941 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940943 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940946 2573 feature_gate.go:328] unrecognized feature gate: 
AzureWorkloadIdentity
Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940948 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940950 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940953 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940955 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940958 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940960 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940963 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940965 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940969 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:12.948825 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940973 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940977 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940980 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940982 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940985 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.940988 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.941678 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.947892 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.947907 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947958 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947964 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947967 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947970 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947973 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947976 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947980 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:12.949341 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947983 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947986 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947988 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947991 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947994 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947997 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.947999 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948002 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948005 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948007 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948010 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948012 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948015 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948017 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948020 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948023 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948026 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948029 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948031 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948034 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:12.949806 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948037 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948039 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948042 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948044 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948047 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948050 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948052 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948055 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948058 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948060 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948063 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948066 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948068 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948071 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948073 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948077 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948081 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948085 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948087 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:12.950285 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948090 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948093 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948095 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948098 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948100 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948103 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948105 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948116 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948120 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948123 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948126 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948129 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948133 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948136 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948140 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948143 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948145 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948152 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948155 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948157 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:12.950785 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948160 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948162 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948165 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948167 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948170 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948173 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948175 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948178 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948180 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948183 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948185 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948188 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948190 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948193 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948195 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948197 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948200 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948202 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948205 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:12.951272 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948208 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.948212 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948308 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948314 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948317 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948320 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948323 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948326 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948329 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948331 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948334 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948337 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948339 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948342 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948344 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948347 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:12.951752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948350 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948352 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948356 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948360 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948364 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948367 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948370 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948372 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948389 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948392 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948394 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948397 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948399 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948402 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948404 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948407 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948409 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948412 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948414 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:12.952147 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948417 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948420 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948423 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948426 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948428 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948431 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948433 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948437 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948439 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948442 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948445 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948448 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948450 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948452 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948455 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948457 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948460 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948462 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948465 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:12.952643 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948468 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948470 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948473 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948475 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948478 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948480 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948483 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948485 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948488 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948491 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948494 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948496 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948499 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948502 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948505 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948508 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948511 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948513 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948516 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948519 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:12.953108 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948521 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948524 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948526 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948529 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948531 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948534 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948536 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948539 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948541 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948544 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948546 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948549 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948551 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:12.948554 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.948558 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:12.953604 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.949218 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 07:51:12.954080 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.954067 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 07:51:12.954874 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.954862 2573 server.go:1019] "Starting client certificate rotation"
Apr 17 07:51:12.954978 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.954962 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:12.955008 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.955001 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:12.976783 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.976766 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:12.981427 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.981303 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:12.996331 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:12.996310 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 17 07:51:13.001676 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.001661 2573 log.go:25] "Validated CRI v1 image API"
Apr 17 07:51:13.003519 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.003504 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 07:51:13.006391 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.006357 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7e9ef57b-9080-4ced-96c7-2b11762a99b3:/dev/nvme0n1p4 c9775989-e068-4e7f-9b77-2ce65c130545:/dev/nvme0n1p3]
Apr 17 07:51:13.006472 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.006392 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 07:51:13.007860 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.007844 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:51:13.011239 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.011128 2573 manager.go:217] Machine: {Timestamp:2026-04-17 07:51:13.009921223 +0000 UTC m=+0.353172677 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099802 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20c72397d6691bbde353a031f86e1f SystemUUID:ec20c723-97d6-691b-bde3-53a031f86e1f BootID:b560ed23-23ab-4244-9c74-e1fa663344c5 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9c:7c:21:da:07 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9c:7c:21:da:07 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:2e:24:eb:1e:59 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 07:51:13.011239 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.011232 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 07:51:13.011365 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.011303 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 07:51:13.013568 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.013544 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 07:51:13.013702 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.013571 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-143.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percent
age":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 07:51:13.013748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.013711 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 07:51:13.013748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.013719 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 07:51:13.013748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.013731 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:13.013748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.013747 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:13.015179 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.015168 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:51:13.015284 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.015276 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 07:51:13.017802 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.017793 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 17 07:51:13.017845 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.017810 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 07:51:13.018437 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.018427 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 07:51:13.018437 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.018439 2573 kubelet.go:397] "Adding 
apiserver pod source" Apr 17 07:51:13.018513 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.018448 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 07:51:13.019496 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.019484 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:13.019534 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.019505 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:13.021974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.021959 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 07:51:13.023719 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.023706 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 07:51:13.024952 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.024939 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 07:51:13.025013 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.024956 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 07:51:13.025013 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.024962 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 07:51:13.025013 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.024967 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 07:51:13.025013 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.024972 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 07:51:13.025013 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.024978 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 
17 07:51:13.025013 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.024983 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 07:51:13.025013 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.024988 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 07:51:13.025013 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.024995 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 07:51:13.025013 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.025001 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 07:51:13.025013 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.025009 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 07:51:13.025276 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.025018 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 07:51:13.025857 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.025847 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 07:51:13.025857 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.025857 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 07:51:13.029209 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.029194 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 07:51:13.029283 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.029227 2573 server.go:1295] "Started kubelet" Apr 17 07:51:13.029354 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.029312 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 07:51:13.029499 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.029430 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 07:51:13.029563 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.029524 2573 server_v1.go:47] 
"podresources" method="list" useActivePods=true Apr 17 07:51:13.029960 ip-10-0-138-143 systemd[1]: Started Kubernetes Kubelet. Apr 17 07:51:13.032933 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.032908 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 07:51:13.033057 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.033031 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-143.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 07:51:13.033108 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.033097 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-143.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 07:51:13.033333 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.033294 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 07:51:13.034279 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.034259 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 17 07:51:13.039873 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.038826 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-143.ec2.internal.18a7158cec8cce2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] 
[] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-143.ec2.internal,UID:ip-10-0-138-143.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-143.ec2.internal,},FirstTimestamp:2026-04-17 07:51:13.029205546 +0000 UTC m=+0.372457002,LastTimestamp:2026-04-17 07:51:13.029205546 +0000 UTC m=+0.372457002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-143.ec2.internal,}" Apr 17 07:51:13.041258 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.041242 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 07:51:13.041352 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.041267 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 07:51:13.041993 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.041971 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 07:51:13.042090 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.042069 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:13.042090 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.042080 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 07:51:13.042221 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.042080 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 07:51:13.042221 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.042106 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 07:51:13.042221 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.042198 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 17 07:51:13.042221 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.042204 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 17 07:51:13.042715 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.042699 2573 factory.go:55] Registering systemd factory Apr 17 07:51:13.042804 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.042796 2573 factory.go:223] Registration of the systemd container factory successfully Apr 17 07:51:13.043477 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.043457 2573 factory.go:153] Registering CRI-O factory Apr 17 07:51:13.043559 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.043479 2573 factory.go:223] Registration of the crio container factory successfully Apr 17 07:51:13.043559 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.043531 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 
07:51:13.043559 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.043550 2573 factory.go:103] Registering Raw factory Apr 17 07:51:13.043680 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.043565 2573 manager.go:1196] Started watching for new ooms in manager Apr 17 07:51:13.043947 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.043932 2573 manager.go:319] Starting recovery of all containers Apr 17 07:51:13.044602 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.044574 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-143.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 07:51:13.044798 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.044761 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 07:51:13.053637 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.053622 2573 manager.go:324] Recovery completed Apr 17 07:51:13.055576 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.055560 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gz652" Apr 17 07:51:13.057568 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.057555 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:13.060049 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.060027 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:13.060111 ip-10-0-138-143 
kubenswrapper[2573]: I0417 07:51:13.060054 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:13.060111 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.060063 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:13.060567 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.060553 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 07:51:13.060607 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.060569 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 07:51:13.060635 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.060618 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:51:13.062526 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.062460 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-143.ec2.internal.18a7158cee634e6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-143.ec2.internal,UID:ip-10-0-138-143.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-143.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-143.ec2.internal,},FirstTimestamp:2026-04-17 07:51:13.060040299 +0000 UTC m=+0.403291756,LastTimestamp:2026-04-17 07:51:13.060040299 +0000 UTC m=+0.403291756,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-143.ec2.internal,}" Apr 17 07:51:13.062938 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.062925 2573 csr.go:270] "Certificate 
signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gz652" Apr 17 07:51:13.063010 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.062996 2573 policy_none.go:49] "None policy: Start" Apr 17 07:51:13.063042 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.063018 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 07:51:13.063042 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.063030 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 17 07:51:13.103945 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.103930 2573 manager.go:341] "Starting Device Plugin manager" Apr 17 07:51:13.108648 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.103958 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 07:51:13.108648 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.103968 2573 server.go:85] "Starting device plugin registration server" Apr 17 07:51:13.108648 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.104164 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 07:51:13.108648 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.104175 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 07:51:13.108648 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.104265 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 07:51:13.108648 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.104334 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 07:51:13.108648 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.104340 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 07:51:13.108648 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.104926 2573 eviction_manager.go:267] "eviction manager: failed to check if we 
have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 07:51:13.108648 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.104957 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:13.157992 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.157966 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 07:51:13.159242 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.159224 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 07:51:13.159329 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.159251 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 07:51:13.159329 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.159269 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 07:51:13.159329 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.159279 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 07:51:13.159329 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.159318 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 07:51:13.163558 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.163541 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:13.204526 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.204477 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:13.205439 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.205424 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:13.205499 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.205451 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:13.205499 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.205461 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:13.205499 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.205486 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.211958 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.211939 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.211958 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.211959 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-143.ec2.internal\": node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 
07:51:13.234028 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.234006 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:13.259956 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.259933 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-143.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal"] Apr 17 07:51:13.260042 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.259986 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:13.260731 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.260717 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:13.260810 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.260746 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:13.260810 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.260760 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:13.261940 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.261925 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:13.262580 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.262564 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.262580 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.262575 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:13.262712 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.262593 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:13.262712 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.262603 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:13.262712 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.262617 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:13.263165 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.263151 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:13.263230 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.263177 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:13.263230 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.263191 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:13.263751 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.263731 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.263826 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.263763 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:13.264370 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.264356 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:13.264440 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.264404 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:13.264440 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.264416 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:13.285931 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.285913 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-143.ec2.internal\" not found" node="ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.289921 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.289908 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-143.ec2.internal\" not found" node="ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.334869 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.334849 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:13.344203 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.344182 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7929c28430e6e1d330dd3b56bc6070ba-config\") pod 
\"kube-apiserver-proxy-ip-10-0-138-143.ec2.internal\" (UID: \"7929c28430e6e1d330dd3b56bc6070ba\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.344254 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.344207 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8260a9f61d855193ed29ff50373ffe9d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal\" (UID: \"8260a9f61d855193ed29ff50373ffe9d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.344254 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.344226 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8260a9f61d855193ed29ff50373ffe9d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal\" (UID: \"8260a9f61d855193ed29ff50373ffe9d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.435822 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.435794 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:13.445199 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.445180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8260a9f61d855193ed29ff50373ffe9d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal\" (UID: \"8260a9f61d855193ed29ff50373ffe9d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.445255 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.445202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/7929c28430e6e1d330dd3b56bc6070ba-config\") pod \"kube-apiserver-proxy-ip-10-0-138-143.ec2.internal\" (UID: \"7929c28430e6e1d330dd3b56bc6070ba\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.445255 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.445217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8260a9f61d855193ed29ff50373ffe9d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal\" (UID: \"8260a9f61d855193ed29ff50373ffe9d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.445255 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.445249 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8260a9f61d855193ed29ff50373ffe9d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal\" (UID: \"8260a9f61d855193ed29ff50373ffe9d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.445343 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.445284 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8260a9f61d855193ed29ff50373ffe9d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal\" (UID: \"8260a9f61d855193ed29ff50373ffe9d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.445343 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.445292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7929c28430e6e1d330dd3b56bc6070ba-config\") pod \"kube-apiserver-proxy-ip-10-0-138-143.ec2.internal\" (UID: \"7929c28430e6e1d330dd3b56bc6070ba\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.536651 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.536590 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:13.588132 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.588111 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.591793 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.591773 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" Apr 17 07:51:13.637538 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.637517 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:13.738075 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.738052 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:13.838597 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.838538 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:13.939197 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:13.939173 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:13.954658 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.954637 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 07:51:13.954796 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:13.954780 2573 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 07:51:14.039921 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:14.039896 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:14.042074 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.042056 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 07:51:14.054173 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.054148 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 07:51:14.064983 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.064956 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:46:13 +0000 UTC" deadline="2027-11-12 15:31:18.877747754 +0000 UTC" Apr 17 07:51:14.064983 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.064980 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13783h40m4.812770451s" Apr 17 07:51:14.127083 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.127063 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qfrr6" Apr 17 07:51:14.132728 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.132705 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qfrr6" Apr 17 07:51:14.140712 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:14.140686 2573 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:14.164389 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.164359 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:14.184974 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:14.184945 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7929c28430e6e1d330dd3b56bc6070ba.slice/crio-a045999095f216e61c099a4329a222b471e09b783bba8b60f822ca1ef11fb376 WatchSource:0}: Error finding container a045999095f216e61c099a4329a222b471e09b783bba8b60f822ca1ef11fb376: Status 404 returned error can't find the container with id a045999095f216e61c099a4329a222b471e09b783bba8b60f822ca1ef11fb376 Apr 17 07:51:14.185346 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:14.185328 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8260a9f61d855193ed29ff50373ffe9d.slice/crio-32592c6be1ead36ea967f7d8ac9c42fec71c950f88ca35e3b1638cb97848933d WatchSource:0}: Error finding container 32592c6be1ead36ea967f7d8ac9c42fec71c950f88ca35e3b1638cb97848933d: Status 404 returned error can't find the container with id 32592c6be1ead36ea967f7d8ac9c42fec71c950f88ca35e3b1638cb97848933d Apr 17 07:51:14.190637 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.190613 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 07:51:14.241058 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:14.241027 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:14.341477 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:14.341450 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-143.ec2.internal\" not found" Apr 17 07:51:14.377227 
ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.377174 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:14.442120 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.442099 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" Apr 17 07:51:14.495709 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.495555 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 07:51:14.497099 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.497087 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-143.ec2.internal" Apr 17 07:51:14.505159 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.505146 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 07:51:14.620389 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:14.620351 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:15.019292 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.019266 2573 apiserver.go:52] "Watching apiserver" Apr 17 07:51:15.026126 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.026082 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 07:51:15.028622 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.028590 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-kwtmq","kube-system/kube-apiserver-proxy-ip-10-0-138-143.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc","openshift-dns/node-resolver-hslwj","openshift-image-registry/node-ca-qd97n","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal","openshift-network-diagnostics/network-check-target-t6snn","kube-system/konnectivity-agent-htcxl","openshift-cluster-node-tuning-operator/tuned-wsbdc","openshift-multus/multus-additional-cni-plugins-tbd6q","openshift-multus/multus-vfldz","openshift-multus/network-metrics-daemon-jvr45","openshift-network-operator/iptables-alerter-tsfwr"] Apr 17 07:51:15.031107 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.031086 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:15.031204 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.031174 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142" Apr 17 07:51:15.032353 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.032332 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hslwj" Apr 17 07:51:15.033617 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.033598 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.036443 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.036423 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 07:51:15.036544 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.036453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 07:51:15.036544 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.036525 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 07:51:15.036650 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.036453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 07:51:15.036698 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.036650 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-htcxl" Apr 17 07:51:15.036750 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.036722 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 07:51:15.036798 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.036789 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-n2pqw\"" Apr 17 07:51:15.036848 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.036821 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qd97n" Apr 17 07:51:15.037027 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.037008 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-v8vrc\"" Apr 17 07:51:15.038052 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.038033 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.039074 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.039054 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 07:51:15.039216 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.039090 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 07:51:15.039216 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.039145 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-hrtps\"" Apr 17 07:51:15.039216 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.039095 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kg88g\"" Apr 17 07:51:15.039372 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.039268 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 07:51:15.039372 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.039355 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 07:51:15.039372 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.039360 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 07:51:15.040546 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.039827 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:51:15.040546 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.040031 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jwllh\"" Apr 17 07:51:15.040546 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.040068 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.040546 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.040213 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 07:51:15.041704 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.041344 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.041970 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.041948 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 07:51:15.042113 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.042100 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 07:51:15.042373 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.042354 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 07:51:15.042524 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.042478 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 07:51:15.042625 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.042607 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gp54w\"" Apr 17 07:51:15.042748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.042730 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.042806 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.042771 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 07:51:15.042904 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.042354 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 07:51:15.043299 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.043281 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 07:51:15.043583 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.043567 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gfkzv\"" Apr 17 07:51:15.043801 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.043785 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 07:51:15.043999 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.043986 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 07:51:15.044166 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.044150 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 07:51:15.044578 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.044401 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 07:51:15.044707 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.044692 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" 
Apr 17 07:51:15.044834 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.044730 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fv5cs\"" Apr 17 07:51:15.044982 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.044961 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tsfwr" Apr 17 07:51:15.045037 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.044989 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:15.045076 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.045039 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102" Apr 17 07:51:15.047038 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.047021 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 07:51:15.047118 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.047028 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 07:51:15.047328 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.047306 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:51:15.047545 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.047530 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xrt5x\"" Apr 17 07:51:15.053187 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053165 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8318f513-f957-48a0-821d-d6718c08e6cb-host\") pod \"node-ca-qd97n\" (UID: \"8318f513-f957-48a0-821d-d6718c08e6cb\") " pod="openshift-image-registry/node-ca-qd97n" Apr 17 07:51:15.053282 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053201 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.053282 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053228 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/75720dfb-fe8b-42c4-b690-1725be056c2e-hosts-file\") pod \"node-resolver-hslwj\" (UID: \"75720dfb-fe8b-42c4-b690-1725be056c2e\") " pod="openshift-dns/node-resolver-hslwj" Apr 17 07:51:15.053282 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053254 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.053481 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053279 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b3477ff-e316-477b-998e-8681a1f30139-ovnkube-config\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.053481 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-sysconfig\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.053481 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhvzn\" (UniqueName: \"kubernetes.io/projected/86ccc290-4522-4b50-9bf3-c06aee8a24d6-kube-api-access-jhvzn\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " 
pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.053481 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053373 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-os-release\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.053481 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053434 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcbmj\" (UniqueName: \"kubernetes.io/projected/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-kube-api-access-fcbmj\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.053481 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053462 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rr5c\" (UniqueName: \"kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c\") pod \"network-check-target-t6snn\" (UID: \"129c1cb1-8484-40fe-b434-4354aab1d142\") " pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:15.053748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053493 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-run-ovn\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.053748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053524 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-cni-bin\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.053748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053546 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-run\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.053748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-host\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.053748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053588 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-tuned\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.053748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053612 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-cnibin\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.053748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053634 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b8ce1fa1-5996-43a1-a121-443015650e07-agent-certs\") pod \"konnectivity-agent-htcxl\" (UID: \"b8ce1fa1-5996-43a1-a121-443015650e07\") " pod="kube-system/konnectivity-agent-htcxl" Apr 17 07:51:15.053748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053658 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8938afcf-2b01-434e-9adb-48a0b9891ff1-iptables-alerter-script\") pod \"iptables-alerter-tsfwr\" (UID: \"8938afcf-2b01-434e-9adb-48a0b9891ff1\") " pod="openshift-network-operator/iptables-alerter-tsfwr" Apr 17 07:51:15.053748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.053748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053743 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053768 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-system-cni-dir\") 
pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053793 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-cnibin\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-var-lib-kubelet\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-sysctl-conf\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053869 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-sys\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053892 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-conf-dir\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053914 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-etc-openvswitch\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053937 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-log-socket\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053959 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-system-cni-dir\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.053994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44rdm\" (UniqueName: \"kubernetes.io/projected/c0a8fdbe-d345-4229-a451-b516b5f45e25-kube-api-access-44rdm\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054018 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8vhh\" (UniqueName: \"kubernetes.io/projected/75720dfb-fe8b-42c4-b690-1725be056c2e-kube-api-access-f8vhh\") pod \"node-resolver-hslwj\" (UID: \"75720dfb-fe8b-42c4-b690-1725be056c2e\") " pod="openshift-dns/node-resolver-hslwj" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054052 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcq8l\" (UniqueName: \"kubernetes.io/projected/6b3477ff-e316-477b-998e-8681a1f30139-kube-api-access-wcq8l\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054082 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-sysctl-d\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-socket-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054128 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-etc-kubernetes\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") 
" pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054144 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75720dfb-fe8b-42c4-b690-1725be056c2e-tmp-dir\") pod \"node-resolver-hslwj\" (UID: \"75720dfb-fe8b-42c4-b690-1725be056c2e\") " pod="openshift-dns/node-resolver-hslwj" Apr 17 07:51:15.054210 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-var-lib-openvswitch\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054182 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b3477ff-e316-477b-998e-8681a1f30139-env-overrides\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054203 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-lib-modules\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054228 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-run-k8s-cni-cncf-io\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054246 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8318f513-f957-48a0-821d-d6718c08e6cb-serviceca\") pod \"node-ca-qd97n\" (UID: \"8318f513-f957-48a0-821d-d6718c08e6cb\") " pod="openshift-image-registry/node-ca-qd97n" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054268 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-run-netns\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054331 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8938afcf-2b01-434e-9adb-48a0b9891ff1-host-slash\") pod \"iptables-alerter-tsfwr\" (UID: \"8938afcf-2b01-434e-9adb-48a0b9891ff1\") " pod="openshift-network-operator/iptables-alerter-tsfwr" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054355 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-registration-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054393 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-systemd\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054419 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-hostroot\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054438 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-cni-dir\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054453 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-socket-dir-parent\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.055007 
ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054487 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-run-netns\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054510 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-var-lib-kubelet\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054531 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86ccc290-4522-4b50-9bf3-c06aee8a24d6-tmp\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054555 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-run-openvswitch\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.055007 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054574 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-sys-fs\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.055744 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054596 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxn4d\" (UniqueName: \"kubernetes.io/projected/e2903cb6-16e9-4f4e-ac73-453b4051b974-kube-api-access-xxn4d\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.055744 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054806 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.055744 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054838 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-os-release\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.055744 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054857 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-var-lib-cni-multus\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.055744 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.054885 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-run-multus-certs\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.055744 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055459 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-kubelet\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.055744 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055666 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c0a8fdbe-d345-4229-a451-b516b5f45e25-cni-binary-copy\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.055744 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055725 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-var-lib-cni-bin\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055764 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-daemon-config\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055797 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055834 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b3477ff-e316-477b-998e-8681a1f30139-ovn-node-metrics-cert\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055858 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-modprobe-d\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-kubernetes\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055895 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sckg5\" (UniqueName: \"kubernetes.io/projected/7e501027-4496-4a12-a9d5-fc5c57942102-kube-api-access-sckg5\") pod \"network-metrics-daemon-jvr45\" (UID: 
\"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055910 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-systemd-units\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-cni-netd\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055945 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-kubelet-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055963 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-device-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-etc-selinux\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.055996 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-slash\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.056014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-node-log\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.056034 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b3477ff-e316-477b-998e-8681a1f30139-ovnkube-script-lib\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.056096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.056054 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b8ce1fa1-5996-43a1-a121-443015650e07-konnectivity-ca\") pod \"konnectivity-agent-htcxl\" (UID: \"b8ce1fa1-5996-43a1-a121-443015650e07\") " pod="kube-system/konnectivity-agent-htcxl" Apr 17 07:51:15.056096 ip-10-0-138-143 
kubenswrapper[2573]: I0417 07:51:15.056074 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjtjd\" (UniqueName: \"kubernetes.io/projected/8938afcf-2b01-434e-9adb-48a0b9891ff1-kube-api-access-kjtjd\") pod \"iptables-alerter-tsfwr\" (UID: \"8938afcf-2b01-434e-9adb-48a0b9891ff1\") " pod="openshift-network-operator/iptables-alerter-tsfwr" Apr 17 07:51:15.056766 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.056090 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxszb\" (UniqueName: \"kubernetes.io/projected/8318f513-f957-48a0-821d-d6718c08e6cb-kube-api-access-wxszb\") pod \"node-ca-qd97n\" (UID: \"8318f513-f957-48a0-821d-d6718c08e6cb\") " pod="openshift-image-registry/node-ca-qd97n" Apr 17 07:51:15.056766 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.056108 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-run-systemd\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.108522 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.108505 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:15.134242 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.134212 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:14 +0000 UTC" deadline="2027-10-02 00:52:05.756998562 +0000 UTC" Apr 17 07:51:15.134242 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.134243 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12785h0m50.622759931s" Apr 17 07:51:15.143691 
ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.143672 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 07:51:15.157278 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157255 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-run-k8s-cni-cncf-io\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157371 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8318f513-f957-48a0-821d-d6718c08e6cb-serviceca\") pod \"node-ca-qd97n\" (UID: \"8318f513-f957-48a0-821d-d6718c08e6cb\") " pod="openshift-image-registry/node-ca-qd97n" Apr 17 07:51:15.157371 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:15.157371 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157330 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-run-netns\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.157371 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157345 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/8938afcf-2b01-434e-9adb-48a0b9891ff1-host-slash\") pod \"iptables-alerter-tsfwr\" (UID: \"8938afcf-2b01-434e-9adb-48a0b9891ff1\") " pod="openshift-network-operator/iptables-alerter-tsfwr" Apr 17 07:51:15.157371 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157360 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-registration-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.157371 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-systemd\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.157643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157371 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-run-k8s-cni-cncf-io\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157414 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-hostroot\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157456 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-hostroot\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-cni-dir\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-socket-dir-parent\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157523 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-run-netns\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157554 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-var-lib-kubelet\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.157643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/86ccc290-4522-4b50-9bf3-c06aee8a24d6-tmp\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.157643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157603 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-run-openvswitch\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.157643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157636 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-sys-fs\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157660 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxn4d\" (UniqueName: \"kubernetes.io/projected/e2903cb6-16e9-4f4e-ac73-453b4051b974-kube-api-access-xxn4d\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157692 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-os-release\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157711 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-var-lib-cni-multus\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-run-multus-certs\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157760 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-kubelet\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c0a8fdbe-d345-4229-a451-b516b5f45e25-cni-binary-copy\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157813 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-var-lib-cni-bin\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-daemon-config\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157843 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8318f513-f957-48a0-821d-d6718c08e6cb-serviceca\") pod \"node-ca-qd97n\" (UID: \"8318f513-f957-48a0-821d-d6718c08e6cb\") " pod="openshift-image-registry/node-ca-qd97n" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157867 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157883 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-socket-dir-parent\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157913 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b3477ff-e316-477b-998e-8681a1f30139-ovn-node-metrics-cert\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157925 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-run-netns\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-modprobe-d\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.157974 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157963 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-kubernetes\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.158651 ip-10-0-138-143 
kubenswrapper[2573]: I0417 07:51:15.157969 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-sys-fs\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157974 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-var-lib-kubelet\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sckg5\" (UniqueName: \"kubernetes.io/projected/7e501027-4496-4a12-a9d5-fc5c57942102-kube-api-access-sckg5\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-systemd-units\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158035 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-cni-netd\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 
17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158052 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-kubelet-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158072 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-device-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158092 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-etc-selinux\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-slash\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-node-log\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158135 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b3477ff-e316-477b-998e-8681a1f30139-ovnkube-script-lib\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158155 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b8ce1fa1-5996-43a1-a121-443015650e07-konnectivity-ca\") pod \"konnectivity-agent-htcxl\" (UID: \"b8ce1fa1-5996-43a1-a121-443015650e07\") " pod="kube-system/konnectivity-agent-htcxl" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtjd\" (UniqueName: \"kubernetes.io/projected/8938afcf-2b01-434e-9adb-48a0b9891ff1-kube-api-access-kjtjd\") pod \"iptables-alerter-tsfwr\" (UID: \"8938afcf-2b01-434e-9adb-48a0b9891ff1\") " pod="openshift-network-operator/iptables-alerter-tsfwr" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158207 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxszb\" (UniqueName: \"kubernetes.io/projected/8318f513-f957-48a0-821d-d6718c08e6cb-kube-api-access-wxszb\") pod \"node-ca-qd97n\" (UID: \"8318f513-f957-48a0-821d-d6718c08e6cb\") " pod="openshift-image-registry/node-ca-qd97n" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-run-systemd\") pod \"ovnkube-node-kwtmq\" 
(UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-os-release\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8318f513-f957-48a0-821d-d6718c08e6cb-host\") pod \"node-ca-qd97n\" (UID: \"8318f513-f957-48a0-821d-d6718c08e6cb\") " pod="openshift-image-registry/node-ca-qd97n" Apr 17 07:51:15.158651 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158271 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158306 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-var-lib-cni-multus\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.159475 ip-10-0-138-143 
kubenswrapper[2573]: I0417 07:51:15.158341 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/75720dfb-fe8b-42c4-b690-1725be056c2e-hosts-file\") pod \"node-resolver-hslwj\" (UID: \"75720dfb-fe8b-42c4-b690-1725be056c2e\") " pod="openshift-dns/node-resolver-hslwj" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158410 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b3477ff-e316-477b-998e-8681a1f30139-ovnkube-config\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-sysconfig\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhvzn\" (UniqueName: \"kubernetes.io/projected/86ccc290-4522-4b50-9bf3-c06aee8a24d6-kube-api-access-jhvzn\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 
07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.158512 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.158608 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs podName:7e501027-4496-4a12-a9d5-fc5c57942102 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:15.658564854 +0000 UTC m=+3.001816299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs") pod "network-metrics-daemon-jvr45" (UID: "7e501027-4496-4a12-a9d5-fc5c57942102") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158623 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-os-release\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-run-multus-certs\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-kubelet\") pod 
\"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158798 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158817 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-run-netns\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158871 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-kubelet-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158873 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8938afcf-2b01-434e-9adb-48a0b9891ff1-host-slash\") pod \"iptables-alerter-tsfwr\" (UID: \"8938afcf-2b01-434e-9adb-48a0b9891ff1\") " pod="openshift-network-operator/iptables-alerter-tsfwr" Apr 17 07:51:15.159475 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158920 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-registration-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158932 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-device-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158983 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-systemd\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158992 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-etc-selinux\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157841 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-cni-dir\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.159060 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-slash\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.159102 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-node-log\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.159316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c0a8fdbe-d345-4229-a451-b516b5f45e25-cni-binary-copy\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.157789 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-run-openvswitch\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.159400 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-var-lib-cni-bin\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.159719 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6b3477ff-e316-477b-998e-8681a1f30139-ovnkube-script-lib\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160035 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-daemon-config\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160100 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/75720dfb-fe8b-42c4-b690-1725be056c2e-hosts-file\") pod \"node-resolver-hslwj\" (UID: \"75720dfb-fe8b-42c4-b690-1725be056c2e\") " pod="openshift-dns/node-resolver-hslwj" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160099 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-run-systemd\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160156 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-sysconfig\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160235 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8318f513-f957-48a0-821d-d6718c08e6cb-host\") pod \"node-ca-qd97n\" (UID: \"8318f513-f957-48a0-821d-d6718c08e6cb\") " pod="openshift-image-registry/node-ca-qd97n" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.158515 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-os-release\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.160401 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160295 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-systemd-units\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160302 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160337 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcbmj\" 
(UniqueName: \"kubernetes.io/projected/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-kube-api-access-fcbmj\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160360 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-cni-netd\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160365 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-kubernetes\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160392 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rr5c\" (UniqueName: \"kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c\") pod \"network-check-target-t6snn\" (UID: \"129c1cb1-8484-40fe-b434-4354aab1d142\") " pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160422 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-run-ovn\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160453 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-cni-bin\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-run\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160505 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-host\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-tuned\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160547 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-modprobe-d\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160559 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-cnibin\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160587 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b8ce1fa1-5996-43a1-a121-443015650e07-agent-certs\") pod \"konnectivity-agent-htcxl\" (UID: \"b8ce1fa1-5996-43a1-a121-443015650e07\") " pod="kube-system/konnectivity-agent-htcxl" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8938afcf-2b01-434e-9adb-48a0b9891ff1-iptables-alerter-script\") pod \"iptables-alerter-tsfwr\" (UID: \"8938afcf-2b01-434e-9adb-48a0b9891ff1\") " pod="openshift-network-operator/iptables-alerter-tsfwr" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160664 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-run\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.161198 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160695 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160728 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-run-ovn\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160754 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-system-cni-dir\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160839 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-cnibin\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160907 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-cnibin\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160914 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-host\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-host-cni-bin\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.160983 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-var-lib-kubelet\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-sysctl-conf\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161080 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b3477ff-e316-477b-998e-8681a1f30139-ovnkube-config\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-sys\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161120 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-cnibin\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-conf-dir\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-sys\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161176 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-system-cni-dir\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-multus-conf-dir\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-etc-openvswitch\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161252 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-sysctl-conf\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.161981 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-log-socket\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161306 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-host-var-lib-kubelet\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-system-cni-dir\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-etc-openvswitch\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161413 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-system-cni-dir\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161360 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-log-socket\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161450 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44rdm\" (UniqueName: \"kubernetes.io/projected/c0a8fdbe-d345-4229-a451-b516b5f45e25-kube-api-access-44rdm\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f8vhh\" (UniqueName: \"kubernetes.io/projected/75720dfb-fe8b-42c4-b690-1725be056c2e-kube-api-access-f8vhh\") pod \"node-resolver-hslwj\" (UID: \"75720dfb-fe8b-42c4-b690-1725be056c2e\") " pod="openshift-dns/node-resolver-hslwj" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161518 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcq8l\" (UniqueName: \"kubernetes.io/projected/6b3477ff-e316-477b-998e-8681a1f30139-kube-api-access-wcq8l\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161516 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8938afcf-2b01-434e-9adb-48a0b9891ff1-iptables-alerter-script\") pod \"iptables-alerter-tsfwr\" (UID: \"8938afcf-2b01-434e-9adb-48a0b9891ff1\") " pod="openshift-network-operator/iptables-alerter-tsfwr" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161544 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-sysctl-d\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161585 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-socket-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161625 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-etc-kubernetes\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161647 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75720dfb-fe8b-42c4-b690-1725be056c2e-tmp-dir\") pod \"node-resolver-hslwj\" (UID: \"75720dfb-fe8b-42c4-b690-1725be056c2e\") " pod="openshift-dns/node-resolver-hslwj" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-var-lib-openvswitch\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161709 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b3477ff-e316-477b-998e-8681a1f30139-env-overrides\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161763 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.171794 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161773 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-lib-modules\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161858 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b8ce1fa1-5996-43a1-a121-443015650e07-konnectivity-ca\") pod \"konnectivity-agent-htcxl\" (UID: \"b8ce1fa1-5996-43a1-a121-443015650e07\") " pod="kube-system/konnectivity-agent-htcxl" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161941 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-lib-modules\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.161969 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2903cb6-16e9-4f4e-ac73-453b4051b974-socket-dir\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.162010 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b3477ff-e316-477b-998e-8681a1f30139-var-lib-openvswitch\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.162081 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: \"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.162090 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-sysctl-d\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.162112 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0a8fdbe-d345-4229-a451-b516b5f45e25-etc-kubernetes\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.162243 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75720dfb-fe8b-42c4-b690-1725be056c2e-tmp-dir\") pod \"node-resolver-hslwj\" (UID: \"75720dfb-fe8b-42c4-b690-1725be056c2e\") " pod="openshift-dns/node-resolver-hslwj" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.162495 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b3477ff-e316-477b-998e-8681a1f30139-env-overrides\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.164038 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86ccc290-4522-4b50-9bf3-c06aee8a24d6-tmp\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.164455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b3477ff-e316-477b-998e-8681a1f30139-ovn-node-metrics-cert\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.165212 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/86ccc290-4522-4b50-9bf3-c06aee8a24d6-etc-tuned\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.167048 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.167082 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.167111 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8rr5c for pod openshift-network-diagnostics/network-check-target-t6snn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 
07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.167338 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c podName:129c1cb1-8484-40fe-b434-4354aab1d142 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:15.667294926 +0000 UTC m=+3.010546383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8rr5c" (UniqueName: "kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c") pod "network-check-target-t6snn" (UID: "129c1cb1-8484-40fe-b434-4354aab1d142") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:15.172543 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.167550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxn4d\" (UniqueName: \"kubernetes.io/projected/e2903cb6-16e9-4f4e-ac73-453b4051b974-kube-api-access-xxn4d\") pod \"aws-ebs-csi-driver-node-27hbc\" (UID: \"e2903cb6-16e9-4f4e-ac73-453b4051b974\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.173363 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.167663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b8ce1fa1-5996-43a1-a121-443015650e07-agent-certs\") pod \"konnectivity-agent-htcxl\" (UID: \"b8ce1fa1-5996-43a1-a121-443015650e07\") " pod="kube-system/konnectivity-agent-htcxl" Apr 17 07:51:15.173363 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.169336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjtjd\" (UniqueName: \"kubernetes.io/projected/8938afcf-2b01-434e-9adb-48a0b9891ff1-kube-api-access-kjtjd\") pod \"iptables-alerter-tsfwr\" (UID: \"8938afcf-2b01-434e-9adb-48a0b9891ff1\") " 
pod="openshift-network-operator/iptables-alerter-tsfwr" Apr 17 07:51:15.173363 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.170962 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-143.ec2.internal" event={"ID":"7929c28430e6e1d330dd3b56bc6070ba","Type":"ContainerStarted","Data":"a045999095f216e61c099a4329a222b471e09b783bba8b60f822ca1ef11fb376"} Apr 17 07:51:15.173363 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.171276 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhvzn\" (UniqueName: \"kubernetes.io/projected/86ccc290-4522-4b50-9bf3-c06aee8a24d6-kube-api-access-jhvzn\") pod \"tuned-wsbdc\" (UID: \"86ccc290-4522-4b50-9bf3-c06aee8a24d6\") " pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.173363 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.171934 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sckg5\" (UniqueName: \"kubernetes.io/projected/7e501027-4496-4a12-a9d5-fc5c57942102-kube-api-access-sckg5\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:15.173363 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.172339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8vhh\" (UniqueName: \"kubernetes.io/projected/75720dfb-fe8b-42c4-b690-1725be056c2e-kube-api-access-f8vhh\") pod \"node-resolver-hslwj\" (UID: \"75720dfb-fe8b-42c4-b690-1725be056c2e\") " pod="openshift-dns/node-resolver-hslwj" Apr 17 07:51:15.173363 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.172425 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcbmj\" (UniqueName: \"kubernetes.io/projected/8e46b3b0-2ff5-430f-acab-a20e11bb02d0-kube-api-access-fcbmj\") pod \"multus-additional-cni-plugins-tbd6q\" (UID: 
\"8e46b3b0-2ff5-430f-acab-a20e11bb02d0\") " pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.173363 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.173096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxszb\" (UniqueName: \"kubernetes.io/projected/8318f513-f957-48a0-821d-d6718c08e6cb-kube-api-access-wxszb\") pod \"node-ca-qd97n\" (UID: \"8318f513-f957-48a0-821d-d6718c08e6cb\") " pod="openshift-image-registry/node-ca-qd97n" Apr 17 07:51:15.173363 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.173185 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcq8l\" (UniqueName: \"kubernetes.io/projected/6b3477ff-e316-477b-998e-8681a1f30139-kube-api-access-wcq8l\") pod \"ovnkube-node-kwtmq\" (UID: \"6b3477ff-e316-477b-998e-8681a1f30139\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.174405 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.174049 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" event={"ID":"8260a9f61d855193ed29ff50373ffe9d","Type":"ContainerStarted","Data":"32592c6be1ead36ea967f7d8ac9c42fec71c950f88ca35e3b1638cb97848933d"} Apr 17 07:51:15.174580 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.174562 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44rdm\" (UniqueName: \"kubernetes.io/projected/c0a8fdbe-d345-4229-a451-b516b5f45e25-kube-api-access-44rdm\") pod \"multus-vfldz\" (UID: \"c0a8fdbe-d345-4229-a451-b516b5f45e25\") " pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.348566 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.348456 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hslwj" Apr 17 07:51:15.356262 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.356238 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" Apr 17 07:51:15.366278 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.366017 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-htcxl" Apr 17 07:51:15.372858 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.372626 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qd97n" Apr 17 07:51:15.378522 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.378250 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" Apr 17 07:51:15.390942 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.390924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:51:15.397459 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.397441 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tbd6q" Apr 17 07:51:15.402965 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.402948 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vfldz" Apr 17 07:51:15.408459 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.408441 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-tsfwr" Apr 17 07:51:15.665920 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.665856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:15.666062 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.665977 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:15.666062 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.666044 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs podName:7e501027-4496-4a12-a9d5-fc5c57942102 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:16.666024544 +0000 UTC m=+4.009275991 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs") pod "network-metrics-daemon-jvr45" (UID: "7e501027-4496-4a12-a9d5-fc5c57942102") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:15.766648 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:15.766612 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rr5c\" (UniqueName: \"kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c\") pod \"network-check-target-t6snn\" (UID: \"129c1cb1-8484-40fe-b434-4354aab1d142\") " pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:15.766799 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.766781 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:15.766837 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.766806 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:15.766837 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.766820 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8rr5c for pod openshift-network-diagnostics/network-check-target-t6snn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:15.766943 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:15.766888 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c podName:129c1cb1-8484-40fe-b434-4354aab1d142 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:16.766869486 +0000 UTC m=+4.110120931 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8rr5c" (UniqueName: "kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c") pod "network-check-target-t6snn" (UID: "129c1cb1-8484-40fe-b434-4354aab1d142") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:15.938368 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:15.938332 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0a8fdbe_d345_4229_a451_b516b5f45e25.slice/crio-b87dbaea8147e2149be44c7b159103d2cb5d02e72701dbe4c4e59579b21d0a79 WatchSource:0}: Error finding container b87dbaea8147e2149be44c7b159103d2cb5d02e72701dbe4c4e59579b21d0a79: Status 404 returned error can't find the container with id b87dbaea8147e2149be44c7b159103d2cb5d02e72701dbe4c4e59579b21d0a79 Apr 17 07:51:15.939407 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:15.939363 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75720dfb_fe8b_42c4_b690_1725be056c2e.slice/crio-e10a40f61a95e97cb6e80cfeda94861f04ffdbf0b1fc7f39062554270c2d58b7 WatchSource:0}: Error finding container e10a40f61a95e97cb6e80cfeda94861f04ffdbf0b1fc7f39062554270c2d58b7: Status 404 returned error can't find the container with id e10a40f61a95e97cb6e80cfeda94861f04ffdbf0b1fc7f39062554270c2d58b7 Apr 17 07:51:15.941153 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:15.940856 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8ce1fa1_5996_43a1_a121_443015650e07.slice/crio-dd9a4c0aeb3527b948b95733e1023bde7201cd965724d4c0f642f000664ba129 WatchSource:0}: Error finding container 
dd9a4c0aeb3527b948b95733e1023bde7201cd965724d4c0f642f000664ba129: Status 404 returned error can't find the container with id dd9a4c0aeb3527b948b95733e1023bde7201cd965724d4c0f642f000664ba129 Apr 17 07:51:15.942996 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:15.942742 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e46b3b0_2ff5_430f_acab_a20e11bb02d0.slice/crio-dace0e604e9efc778bbce3a65f1638bd7ffa210f274b8c9ccb6068b999013ad4 WatchSource:0}: Error finding container dace0e604e9efc778bbce3a65f1638bd7ffa210f274b8c9ccb6068b999013ad4: Status 404 returned error can't find the container with id dace0e604e9efc778bbce3a65f1638bd7ffa210f274b8c9ccb6068b999013ad4 Apr 17 07:51:15.942996 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:15.942965 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2903cb6_16e9_4f4e_ac73_453b4051b974.slice/crio-4731f4421fc50d6d173b933af9e67a12bb1d454f7131071e422c25362636c80c WatchSource:0}: Error finding container 4731f4421fc50d6d173b933af9e67a12bb1d454f7131071e422c25362636c80c: Status 404 returned error can't find the container with id 4731f4421fc50d6d173b933af9e67a12bb1d454f7131071e422c25362636c80c Apr 17 07:51:15.945690 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:15.945664 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8938afcf_2b01_434e_9adb_48a0b9891ff1.slice/crio-5d0694d25c39b8b86edd1ce65be95334c8721a62bc29764f8acc6b60b310ccd9 WatchSource:0}: Error finding container 5d0694d25c39b8b86edd1ce65be95334c8721a62bc29764f8acc6b60b310ccd9: Status 404 returned error can't find the container with id 5d0694d25c39b8b86edd1ce65be95334c8721a62bc29764f8acc6b60b310ccd9 Apr 17 07:51:15.946429 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:15.946352 2573 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86ccc290_4522_4b50_9bf3_c06aee8a24d6.slice/crio-225c8b4d9d35f56e5e1c54c651e1eba48567ec8ec7898e12ca9bd15edacc8f78 WatchSource:0}: Error finding container 225c8b4d9d35f56e5e1c54c651e1eba48567ec8ec7898e12ca9bd15edacc8f78: Status 404 returned error can't find the container with id 225c8b4d9d35f56e5e1c54c651e1eba48567ec8ec7898e12ca9bd15edacc8f78
Apr 17 07:51:15.947678 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:15.947602 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b3477ff_e316_477b_998e_8681a1f30139.slice/crio-8c94b4a23827c371d88f62ce51ae2f57f420099474c40353bf6bbbd927c34a49 WatchSource:0}: Error finding container 8c94b4a23827c371d88f62ce51ae2f57f420099474c40353bf6bbbd927c34a49: Status 404 returned error can't find the container with id 8c94b4a23827c371d88f62ce51ae2f57f420099474c40353bf6bbbd927c34a49
Apr 17 07:51:15.948428 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:15.948404 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8318f513_f957_48a0_821d_d6718c08e6cb.slice/crio-5e9509629a11d1baad75afccb720d8d01b6dd5f1ddaaa8f582a96ac7abe3b3c9 WatchSource:0}: Error finding container 5e9509629a11d1baad75afccb720d8d01b6dd5f1ddaaa8f582a96ac7abe3b3c9: Status 404 returned error can't find the container with id 5e9509629a11d1baad75afccb720d8d01b6dd5f1ddaaa8f582a96ac7abe3b3c9
Apr 17 07:51:16.135070 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.135033 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:14 +0000 UTC" deadline="2027-12-17 09:31:03.213297446 +0000 UTC"
Apr 17 07:51:16.135070 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.135069 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14617h39m47.078232458s"
Apr 17 07:51:16.176580 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.176520 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-143.ec2.internal" event={"ID":"7929c28430e6e1d330dd3b56bc6070ba","Type":"ContainerStarted","Data":"972a9a3eeea69f7d955ab582504303e2b92b1c09dc264e2a01ec8a75e6318ae3"}
Apr 17 07:51:16.177576 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.177546 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" event={"ID":"86ccc290-4522-4b50-9bf3-c06aee8a24d6","Type":"ContainerStarted","Data":"225c8b4d9d35f56e5e1c54c651e1eba48567ec8ec7898e12ca9bd15edacc8f78"}
Apr 17 07:51:16.178722 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.178692 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tsfwr" event={"ID":"8938afcf-2b01-434e-9adb-48a0b9891ff1","Type":"ContainerStarted","Data":"5d0694d25c39b8b86edd1ce65be95334c8721a62bc29764f8acc6b60b310ccd9"}
Apr 17 07:51:16.179744 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.179722 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" event={"ID":"e2903cb6-16e9-4f4e-ac73-453b4051b974","Type":"ContainerStarted","Data":"4731f4421fc50d6d173b933af9e67a12bb1d454f7131071e422c25362636c80c"}
Apr 17 07:51:16.180664 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.180643 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-htcxl" event={"ID":"b8ce1fa1-5996-43a1-a121-443015650e07","Type":"ContainerStarted","Data":"dd9a4c0aeb3527b948b95733e1023bde7201cd965724d4c0f642f000664ba129"}
Apr 17 07:51:16.181607 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.181587 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hslwj" event={"ID":"75720dfb-fe8b-42c4-b690-1725be056c2e","Type":"ContainerStarted","Data":"e10a40f61a95e97cb6e80cfeda94861f04ffdbf0b1fc7f39062554270c2d58b7"}
Apr 17 07:51:16.182745 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.182723 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" event={"ID":"6b3477ff-e316-477b-998e-8681a1f30139","Type":"ContainerStarted","Data":"8c94b4a23827c371d88f62ce51ae2f57f420099474c40353bf6bbbd927c34a49"}
Apr 17 07:51:16.183602 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.183581 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qd97n" event={"ID":"8318f513-f957-48a0-821d-d6718c08e6cb","Type":"ContainerStarted","Data":"5e9509629a11d1baad75afccb720d8d01b6dd5f1ddaaa8f582a96ac7abe3b3c9"}
Apr 17 07:51:16.184678 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.184660 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tbd6q" event={"ID":"8e46b3b0-2ff5-430f-acab-a20e11bb02d0","Type":"ContainerStarted","Data":"dace0e604e9efc778bbce3a65f1638bd7ffa210f274b8c9ccb6068b999013ad4"}
Apr 17 07:51:16.185668 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.185651 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vfldz" event={"ID":"c0a8fdbe-d345-4229-a451-b516b5f45e25","Type":"ContainerStarted","Data":"b87dbaea8147e2149be44c7b159103d2cb5d02e72701dbe4c4e59579b21d0a79"}
Apr 17 07:51:16.188493 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.188447 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-143.ec2.internal" podStartSLOduration=2.188433992 podStartE2EDuration="2.188433992s" podCreationTimestamp="2026-04-17 07:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:16.187964726 +0000 UTC m=+3.531216193" watchObservedRunningTime="2026-04-17 07:51:16.188433992 +0000 UTC m=+3.531685457"
Apr 17 07:51:16.673585 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.672988 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:16.673585 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:16.673154 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:16.673585 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:16.673221 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs podName:7e501027-4496-4a12-a9d5-fc5c57942102 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:18.673202867 +0000 UTC m=+6.016454313 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs") pod "network-metrics-daemon-jvr45" (UID: "7e501027-4496-4a12-a9d5-fc5c57942102") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:16.774009 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:16.773935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rr5c\" (UniqueName: \"kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c\") pod \"network-check-target-t6snn\" (UID: \"129c1cb1-8484-40fe-b434-4354aab1d142\") " pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:16.774147 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:16.774085 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:16.774147 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:16.774103 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:16.774147 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:16.774117 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8rr5c for pod openshift-network-diagnostics/network-check-target-t6snn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:16.774315 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:16.774174 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c podName:129c1cb1-8484-40fe-b434-4354aab1d142 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:18.774156646 +0000 UTC m=+6.117408091 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8rr5c" (UniqueName: "kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c") pod "network-check-target-t6snn" (UID: "129c1cb1-8484-40fe-b434-4354aab1d142") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:17.160862 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:17.160761 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:17.161322 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:17.160895 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:17.161404 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:17.161358 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:17.161541 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:17.161496 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:17.211116 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:17.211083 2573 generic.go:358] "Generic (PLEG): container finished" podID="8260a9f61d855193ed29ff50373ffe9d" containerID="fb1f72812a06842ad5cd3fb30bc831b604e710c2f647ec9c7aa90bd747773a7e" exitCode=0
Apr 17 07:51:17.211253 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:17.211121 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" event={"ID":"8260a9f61d855193ed29ff50373ffe9d","Type":"ContainerDied","Data":"fb1f72812a06842ad5cd3fb30bc831b604e710c2f647ec9c7aa90bd747773a7e"}
Apr 17 07:51:18.230830 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:18.230778 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" event={"ID":"8260a9f61d855193ed29ff50373ffe9d","Type":"ContainerStarted","Data":"6c6f88e5ec9ac35aa223e396393acf0f8c96ef88f4d44aad0c06161f46947cac"}
Apr 17 07:51:18.687951 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:18.687870 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:18.688113 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:18.688011 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:18.688113 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:18.688073 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs podName:7e501027-4496-4a12-a9d5-fc5c57942102 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:22.688054241 +0000 UTC m=+10.031305694 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs") pod "network-metrics-daemon-jvr45" (UID: "7e501027-4496-4a12-a9d5-fc5c57942102") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:18.788719 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:18.788680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rr5c\" (UniqueName: \"kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c\") pod \"network-check-target-t6snn\" (UID: \"129c1cb1-8484-40fe-b434-4354aab1d142\") " pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:18.788901 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:18.788887 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:18.788960 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:18.788906 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:18.788960 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:18.788918 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8rr5c for pod openshift-network-diagnostics/network-check-target-t6snn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:18.789070 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:18.788973 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c podName:129c1cb1-8484-40fe-b434-4354aab1d142 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:22.788955958 +0000 UTC m=+10.132207412 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8rr5c" (UniqueName: "kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c") pod "network-check-target-t6snn" (UID: "129c1cb1-8484-40fe-b434-4354aab1d142") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:19.162242 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:19.162162 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:19.162428 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:19.162308 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:19.162895 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:19.162702 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:19.162895 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:19.162786 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:20.712817 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:20.712759 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-143.ec2.internal" podStartSLOduration=6.712740312 podStartE2EDuration="6.712740312s" podCreationTimestamp="2026-04-17 07:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:18.246657822 +0000 UTC m=+5.589909285" watchObservedRunningTime="2026-04-17 07:51:20.712740312 +0000 UTC m=+8.055991793"
Apr 17 07:51:20.713490 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:20.713454 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lm9nr"]
Apr 17 07:51:20.716469 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:20.716447 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:20.716579 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:20.716536 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f"
Apr 17 07:51:20.804770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:20.804540 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9ce8efaa-a4ae-457a-b417-1aa180ef551f-dbus\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:20.804770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:20.804591 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:20.804770 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:20.804686 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9ce8efaa-a4ae-457a-b417-1aa180ef551f-kubelet-config\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:20.906239 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:20.905554 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9ce8efaa-a4ae-457a-b417-1aa180ef551f-kubelet-config\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:20.906239 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:20.905600 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9ce8efaa-a4ae-457a-b417-1aa180ef551f-dbus\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:20.906239 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:20.905627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:20.906239 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:20.905743 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:20.906239 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:20.905801 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret podName:9ce8efaa-a4ae-457a-b417-1aa180ef551f nodeName:}" failed. No retries permitted until 2026-04-17 07:51:21.405782703 +0000 UTC m=+8.749034161 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret") pod "global-pull-secret-syncer-lm9nr" (UID: "9ce8efaa-a4ae-457a-b417-1aa180ef551f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:20.906239 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:20.906058 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9ce8efaa-a4ae-457a-b417-1aa180ef551f-kubelet-config\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:20.906239 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:20.906201 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9ce8efaa-a4ae-457a-b417-1aa180ef551f-dbus\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:21.162232 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:21.162151 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:21.162364 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:21.162281 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:21.162706 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:21.162680 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:21.162829 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:21.162789 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:21.409734 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:21.409608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:21.409891 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:21.409785 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:21.409891 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:21.409852 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret podName:9ce8efaa-a4ae-457a-b417-1aa180ef551f nodeName:}" failed. No retries permitted until 2026-04-17 07:51:22.409834806 +0000 UTC m=+9.753086252 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret") pod "global-pull-secret-syncer-lm9nr" (UID: "9ce8efaa-a4ae-457a-b417-1aa180ef551f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:22.159893 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:22.159711 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:22.159893 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:22.159846 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f"
Apr 17 07:51:22.418554 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:22.418467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:22.418715 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:22.418628 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:22.418715 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:22.418704 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret podName:9ce8efaa-a4ae-457a-b417-1aa180ef551f nodeName:}" failed. No retries permitted until 2026-04-17 07:51:24.418682907 +0000 UTC m=+11.761934351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret") pod "global-pull-secret-syncer-lm9nr" (UID: "9ce8efaa-a4ae-457a-b417-1aa180ef551f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:22.720619 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:22.720536 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:22.720772 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:22.720698 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:22.720772 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:22.720762 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs podName:7e501027-4496-4a12-a9d5-fc5c57942102 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:30.720742342 +0000 UTC m=+18.063993784 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs") pod "network-metrics-daemon-jvr45" (UID: "7e501027-4496-4a12-a9d5-fc5c57942102") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:22.821806 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:22.821774 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rr5c\" (UniqueName: \"kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c\") pod \"network-check-target-t6snn\" (UID: \"129c1cb1-8484-40fe-b434-4354aab1d142\") " pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:22.822046 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:22.822027 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:22.822116 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:22.822055 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:22.822116 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:22.822069 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8rr5c for pod openshift-network-diagnostics/network-check-target-t6snn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:22.822250 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:22.822129 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c podName:129c1cb1-8484-40fe-b434-4354aab1d142 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:30.822111728 +0000 UTC m=+18.165363169 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8rr5c" (UniqueName: "kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c") pod "network-check-target-t6snn" (UID: "129c1cb1-8484-40fe-b434-4354aab1d142") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:23.160782 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:23.160758 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:23.161143 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:23.160842 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:23.161143 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:23.161002 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:23.161261 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:23.161189 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:24.160283 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:24.159804 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:24.160283 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:24.159940 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f"
Apr 17 07:51:24.435012 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:24.434460 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:24.435012 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:24.434605 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:24.435012 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:24.434663 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret podName:9ce8efaa-a4ae-457a-b417-1aa180ef551f nodeName:}" failed. No retries permitted until 2026-04-17 07:51:28.434645621 +0000 UTC m=+15.777897063 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret") pod "global-pull-secret-syncer-lm9nr" (UID: "9ce8efaa-a4ae-457a-b417-1aa180ef551f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:25.160421 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:25.160348 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:25.160509 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:25.160436 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:25.160571 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:25.160548 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:25.160670 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:25.160637 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:26.159953 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.159730 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:26.160777 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:26.159996 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f"
Apr 17 07:51:26.256796 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.256764 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" event={"ID":"86ccc290-4522-4b50-9bf3-c06aee8a24d6","Type":"ContainerStarted","Data":"af82134403ce0771721433e31db25eeab2961a4082230c69f6b26ba6995fa600"}
Apr 17 07:51:26.258831 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.258802 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" event={"ID":"e2903cb6-16e9-4f4e-ac73-453b4051b974","Type":"ContainerStarted","Data":"78834f611d5de3607e451f2007c009d89f45beaf8c30443e2cb4a9de061d5df0"}
Apr 17 07:51:26.261030 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.260678 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-htcxl" event={"ID":"b8ce1fa1-5996-43a1-a121-443015650e07","Type":"ContainerStarted","Data":"126c295a2fc2c97b02a1346549287343e493510c33dcc1b3fb94a70ff97a8116"}
Apr 17 07:51:26.262116 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.262081 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hslwj" event={"ID":"75720dfb-fe8b-42c4-b690-1725be056c2e","Type":"ContainerStarted","Data":"78ab3a588b4dbb02e8f7e980642fd371945f0f2b697fc72616e2ab38e9ef667b"}
Apr 17 07:51:26.264805 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.264783 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qd97n" event={"ID":"8318f513-f957-48a0-821d-d6718c08e6cb","Type":"ContainerStarted","Data":"a5f9a54406ca2b3fd4028279e09e38a7ea28452977ae0e1389690d2bdf2b393a"}
Apr 17 07:51:26.266293 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.266251 2573 generic.go:358] "Generic (PLEG): container finished" podID="8e46b3b0-2ff5-430f-acab-a20e11bb02d0" containerID="4bb05555c70ae00186e07757de1d524226947282ebda65e3dc8700aede0042d3" exitCode=0
Apr 17 07:51:26.266293 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.266292 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tbd6q" event={"ID":"8e46b3b0-2ff5-430f-acab-a20e11bb02d0","Type":"ContainerDied","Data":"4bb05555c70ae00186e07757de1d524226947282ebda65e3dc8700aede0042d3"}
Apr 17 07:51:26.270506 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.270469 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wsbdc" podStartSLOduration=4.024390157 podStartE2EDuration="13.270454267s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="2026-04-17 07:51:15.948230801 +0000 UTC m=+3.291482241" lastFinishedPulling="2026-04-17 07:51:25.194294896 +0000 UTC m=+12.537546351" observedRunningTime="2026-04-17 07:51:26.269798838 +0000 UTC m=+13.613050300" watchObservedRunningTime="2026-04-17 07:51:26.270454267 +0000 UTC m=+13.613705731"
Apr 17 07:51:26.281487 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.281452 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hslwj" podStartSLOduration=4.243113384 podStartE2EDuration="13.281442361s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="2026-04-17 07:51:15.941028441 +0000 UTC m=+3.284279882" lastFinishedPulling="2026-04-17 07:51:24.979357418 +0000 UTC m=+12.322608859" observedRunningTime="2026-04-17 07:51:26.281367854 +0000 UTC m=+13.624619316" watchObservedRunningTime="2026-04-17 07:51:26.281442361 +0000 UTC m=+13.624693821"
Apr 17 07:51:26.315879 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.315841 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qd97n" podStartSLOduration=4.288981958 podStartE2EDuration="13.315831731s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="2026-04-17 07:51:15.951298556 +0000 UTC m=+3.294550005" lastFinishedPulling="2026-04-17 07:51:24.978148337 +0000 UTC m=+12.321399778" observedRunningTime="2026-04-17 07:51:26.315765657 +0000 UTC m=+13.659017119" watchObservedRunningTime="2026-04-17 07:51:26.315831731 +0000 UTC m=+13.659083194"
Apr 17 07:51:26.315960 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.315944 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-htcxl" podStartSLOduration=4.275632508 podStartE2EDuration="13.315939765s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="2026-04-17 07:51:15.943761919 +0000 UTC m=+3.287013364" lastFinishedPulling="2026-04-17 07:51:24.984069175 +0000 UTC m=+12.327320621" observedRunningTime="2026-04-17 07:51:26.294608579 +0000 UTC m=+13.637860042" watchObservedRunningTime="2026-04-17 07:51:26.315939765 +0000 UTC m=+13.659191225"
Apr 17 07:51:26.654615 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.654583 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-htcxl"
Apr 17 07:51:26.655299 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:26.655278 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-htcxl"
Apr 17 07:51:27.159483 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:27.159455 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:27.159653 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:27.159487 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:27.159653 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:27.159571 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102" Apr 17 07:51:27.159787 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:27.159685 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142" Apr 17 07:51:27.270023 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:27.269986 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tsfwr" event={"ID":"8938afcf-2b01-434e-9adb-48a0b9891ff1","Type":"ContainerStarted","Data":"4662c91cfd15337dc77a045e8577ad9e86b0ac2ed18bda4540b8e5d32cc476e3"} Apr 17 07:51:27.281373 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:27.281317 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tsfwr" podStartSLOduration=5.249687725 podStartE2EDuration="14.281303303s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="2026-04-17 07:51:15.948294695 +0000 UTC m=+3.291546142" lastFinishedPulling="2026-04-17 07:51:24.979910278 +0000 UTC m=+12.323161720" observedRunningTime="2026-04-17 07:51:27.281085948 +0000 UTC m=+14.624337412" watchObservedRunningTime="2026-04-17 07:51:27.281303303 +0000 UTC m=+14.624554765" Apr 17 07:51:28.159733 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:28.159699 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:28.159886 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:28.159820 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f" Apr 17 07:51:28.271777 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:28.271745 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 07:51:28.468008 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:28.467920 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:28.468168 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:28.468056 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:28.468168 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:28.468127 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret podName:9ce8efaa-a4ae-457a-b417-1aa180ef551f nodeName:}" failed. No retries permitted until 2026-04-17 07:51:36.468107892 +0000 UTC m=+23.811359358 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret") pod "global-pull-secret-syncer-lm9nr" (UID: "9ce8efaa-a4ae-457a-b417-1aa180ef551f") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:29.159833 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:29.159641 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:29.160011 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:29.159730 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:29.160011 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:29.159932 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142" Apr 17 07:51:29.160128 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:29.160012 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102" Apr 17 07:51:30.160207 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:30.160165 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:30.160749 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:30.160298 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f" Apr 17 07:51:30.786757 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:30.786716 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:30.786946 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:30.786854 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:30.786946 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:30.786920 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs podName:7e501027-4496-4a12-a9d5-fc5c57942102 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:46.786900507 +0000 UTC m=+34.130151947 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs") pod "network-metrics-daemon-jvr45" (UID: "7e501027-4496-4a12-a9d5-fc5c57942102") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:30.887597 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:30.887568 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rr5c\" (UniqueName: \"kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c\") pod \"network-check-target-t6snn\" (UID: \"129c1cb1-8484-40fe-b434-4354aab1d142\") " pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:30.887741 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:30.887708 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:30.887741 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:30.887731 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:30.887841 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:30.887743 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8rr5c for pod openshift-network-diagnostics/network-check-target-t6snn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:30.887841 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:30.887795 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c podName:129c1cb1-8484-40fe-b434-4354aab1d142 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:46.887781345 +0000 UTC m=+34.231032786 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8rr5c" (UniqueName: "kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c") pod "network-check-target-t6snn" (UID: "129c1cb1-8484-40fe-b434-4354aab1d142") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:30.983953 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:30.983922 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-htcxl" Apr 17 07:51:30.984100 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:30.984067 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 07:51:30.984836 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:30.984811 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-htcxl" Apr 17 07:51:31.159966 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:31.159893 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:31.160103 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:31.159893 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:31.160103 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:31.160009 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142" Apr 17 07:51:31.160202 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:31.160118 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102" Apr 17 07:51:32.160066 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:32.160031 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:32.160503 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:32.160134 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f" Apr 17 07:51:33.161036 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:33.161000 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:33.161530 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:33.161122 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102" Apr 17 07:51:33.161530 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:33.161189 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:33.161530 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:33.161295 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142" Apr 17 07:51:34.159896 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:34.159853 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:34.160082 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:34.159978 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f" Apr 17 07:51:35.159776 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:35.159565 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:35.159776 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:35.159594 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:35.159776 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:35.159695 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102" Apr 17 07:51:35.160341 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:35.159821 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142" Apr 17 07:51:36.159752 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:36.159720 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:36.159921 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:36.159827 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f" Apr 17 07:51:36.521834 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:36.521812 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 07:51:36.531643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:36.531622 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:36.531780 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:36.531762 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:36.531825 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:36.531816 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret podName:9ce8efaa-a4ae-457a-b417-1aa180ef551f nodeName:}" failed. No retries permitted until 2026-04-17 07:51:52.531802632 +0000 UTC m=+39.875054073 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret") pod "global-pull-secret-syncer-lm9nr" (UID: "9ce8efaa-a4ae-457a-b417-1aa180ef551f") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:37.117487 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.117142 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T07:51:36.521830148Z","UUID":"8bc78a39-03b9-4f85-aaea-5e69bc65512c","Handler":null,"Name":"","Endpoint":""} Apr 17 07:51:37.119674 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.119654 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 07:51:37.119778 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.119683 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 07:51:37.163028 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.163009 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:37.163564 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.163009 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:37.163564 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:37.163101 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142" Apr 17 07:51:37.163564 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:37.163178 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102" Apr 17 07:51:37.287605 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.287572 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" event={"ID":"e2903cb6-16e9-4f4e-ac73-453b4051b974","Type":"ContainerStarted","Data":"780a00797867619526bac14e70f8c78403433ec7b758e25152efb47f6f29978a"} Apr 17 07:51:37.290115 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.290087 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" event={"ID":"6b3477ff-e316-477b-998e-8681a1f30139","Type":"ContainerStarted","Data":"f065c0f4a506dc4cea0b4c359b02078bc4c88ffa92adc4009dacb01382d7fb32"} Apr 17 07:51:37.290214 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.290119 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" event={"ID":"6b3477ff-e316-477b-998e-8681a1f30139","Type":"ContainerStarted","Data":"4197bd7ce372b757a01962948922137e17654da86f1a8a4f76c4b55f107da3bd"} Apr 17 07:51:37.290214 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.290131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" event={"ID":"6b3477ff-e316-477b-998e-8681a1f30139","Type":"ContainerStarted","Data":"78d0f800b6afe2d25140d2d5ec7983e896ff4c200d238cce76052a7ea162700c"} Apr 17 07:51:37.290214 ip-10-0-138-143 
kubenswrapper[2573]: I0417 07:51:37.290144 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" event={"ID":"6b3477ff-e316-477b-998e-8681a1f30139","Type":"ContainerStarted","Data":"245e1b051e2a2745f807d258f60b66b4d9e13cad1a9310612fe7899a006da188"} Apr 17 07:51:37.290214 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.290156 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" event={"ID":"6b3477ff-e316-477b-998e-8681a1f30139","Type":"ContainerStarted","Data":"2fc7659dd25539e9bedc9adb1ea6115d36a2e119f7112f1c843a66d39c1caac5"} Apr 17 07:51:37.290214 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.290163 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" event={"ID":"6b3477ff-e316-477b-998e-8681a1f30139","Type":"ContainerStarted","Data":"dc6a85a77366e84d1336890af67c8115e6d39618af796e904f54a819e99a224e"} Apr 17 07:51:37.291573 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.291549 2573 generic.go:358] "Generic (PLEG): container finished" podID="8e46b3b0-2ff5-430f-acab-a20e11bb02d0" containerID="f5d3e6452833bb283f40d0a83df1221afc16292bd5abfb49fb844e2883ebd0dd" exitCode=0 Apr 17 07:51:37.291663 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.291619 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tbd6q" event={"ID":"8e46b3b0-2ff5-430f-acab-a20e11bb02d0","Type":"ContainerDied","Data":"f5d3e6452833bb283f40d0a83df1221afc16292bd5abfb49fb844e2883ebd0dd"} Apr 17 07:51:37.292868 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:37.292844 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vfldz" event={"ID":"c0a8fdbe-d345-4229-a451-b516b5f45e25","Type":"ContainerStarted","Data":"d8d90e3f021d0a046929f7df3582895b6b0e10b4a36bf9d0d6c989b6c5f9da79"} Apr 17 07:51:37.338925 ip-10-0-138-143 kubenswrapper[2573]: I0417 
07:51:37.338891 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vfldz" podStartSLOduration=3.964342319 podStartE2EDuration="24.338879291s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="2026-04-17 07:51:15.939913338 +0000 UTC m=+3.283164782" lastFinishedPulling="2026-04-17 07:51:36.314450313 +0000 UTC m=+23.657701754" observedRunningTime="2026-04-17 07:51:37.338603244 +0000 UTC m=+24.681854708" watchObservedRunningTime="2026-04-17 07:51:37.338879291 +0000 UTC m=+24.682130754" Apr 17 07:51:38.159911 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:38.159884 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:38.160159 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:38.159989 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f" Apr 17 07:51:39.162580 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:39.162558 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:39.162839 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:39.162567 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:39.162839 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:39.162639 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:39.162839 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:39.162713 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:39.297698 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:39.297672 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" event={"ID":"e2903cb6-16e9-4f4e-ac73-453b4051b974","Type":"ContainerStarted","Data":"e289abe3f6f6c15d50462a54b4cdf08688c2f9190f63e54c3c723c3fabe48cf2"}
Apr 17 07:51:39.300481 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:39.300458 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" event={"ID":"6b3477ff-e316-477b-998e-8681a1f30139","Type":"ContainerStarted","Data":"0ab287ba64cc9cbcb4bb418530af8ffc8373126766d0375d1949e95273d3816c"}
Apr 17 07:51:39.302168 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:39.302148 2573 generic.go:358] "Generic (PLEG): container finished" podID="8e46b3b0-2ff5-430f-acab-a20e11bb02d0" containerID="41285a5bbed02dfbca4e9f34593da925da1ca76918235bc12e57d10c87d81c42" exitCode=0
Apr 17 07:51:39.302261 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:39.302179 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tbd6q" event={"ID":"8e46b3b0-2ff5-430f-acab-a20e11bb02d0","Type":"ContainerDied","Data":"41285a5bbed02dfbca4e9f34593da925da1ca76918235bc12e57d10c87d81c42"}
Apr 17 07:51:39.312968 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:39.312932 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27hbc" podStartSLOduration=4.005105035 podStartE2EDuration="26.312921633s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="2026-04-17 07:51:15.94499243 +0000 UTC m=+3.288243871" lastFinishedPulling="2026-04-17 07:51:38.252809028 +0000 UTC m=+25.596060469" observedRunningTime="2026-04-17 07:51:39.312769643 +0000 UTC m=+26.656021105" watchObservedRunningTime="2026-04-17 07:51:39.312921633 +0000 UTC m=+26.656173093"
Apr 17 07:51:40.160442 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:40.160415 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:40.160591 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:40.160513 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f"
Apr 17 07:51:41.162984 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:41.162957 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:41.162984 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:41.162980 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:41.163402 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:41.163049 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:41.163402 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:41.163185 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:41.307398 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:41.307339 2573 generic.go:358] "Generic (PLEG): container finished" podID="8e46b3b0-2ff5-430f-acab-a20e11bb02d0" containerID="89cec4fd7870cb59a07fff2da8d5b607a88211988748dcf329fe0a7302bdd4c8" exitCode=0
Apr 17 07:51:41.307480 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:41.307397 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tbd6q" event={"ID":"8e46b3b0-2ff5-430f-acab-a20e11bb02d0","Type":"ContainerDied","Data":"89cec4fd7870cb59a07fff2da8d5b607a88211988748dcf329fe0a7302bdd4c8"}
Apr 17 07:51:42.160274 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:42.160243 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:42.160447 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:42.160344 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f"
Apr 17 07:51:42.312101 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:42.312066 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" event={"ID":"6b3477ff-e316-477b-998e-8681a1f30139","Type":"ContainerStarted","Data":"073646514b4bfe02ce820f243d9a88f7884133d1a8c2a4c9843b26f4e2bcba6c"}
Apr 17 07:51:42.312483 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:42.312343 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq"
Apr 17 07:51:42.327240 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:42.327215 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq"
Apr 17 07:51:42.336936 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:42.336894 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" podStartSLOduration=8.972015148 podStartE2EDuration="29.336882084s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="2026-04-17 07:51:15.949578069 +0000 UTC m=+3.292829514" lastFinishedPulling="2026-04-17 07:51:36.314444995 +0000 UTC m=+23.657696450" observedRunningTime="2026-04-17 07:51:42.335887007 +0000 UTC m=+29.679138504" watchObservedRunningTime="2026-04-17 07:51:42.336882084 +0000 UTC m=+29.680133546"
Apr 17 07:51:43.160866 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:43.160840 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:43.160995 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:43.160920 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:43.160995 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:43.160927 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:43.160995 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:43.160988 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:43.251608 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:43.251573 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t6snn"]
Apr 17 07:51:43.254521 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:43.254490 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jvr45"]
Apr 17 07:51:43.254969 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:43.254946 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lm9nr"]
Apr 17 07:51:43.255077 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:43.255055 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:43.255156 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:43.255139 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f"
Apr 17 07:51:43.313633 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:43.313602 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:43.314029 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:43.313654 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:43.314029 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:43.313800 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:43.314029 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:43.313888 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq"
Apr 17 07:51:43.314301 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:43.314271 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:43.314340 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:43.314316 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq"
Apr 17 07:51:43.328435 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:43.328416 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq"
Apr 17 07:51:45.160470 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:45.160441 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:45.160470 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:45.160496 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:45.161020 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:45.160587 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f"
Apr 17 07:51:45.161020 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:45.160624 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:45.161020 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:45.160725 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:45.161020 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:45.160827 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:46.807047 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:46.806794 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:46.807456 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:46.806942 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:46.807456 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:46.807203 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs podName:7e501027-4496-4a12-a9d5-fc5c57942102 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:18.807177214 +0000 UTC m=+66.150428659 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs") pod "network-metrics-daemon-jvr45" (UID: "7e501027-4496-4a12-a9d5-fc5c57942102") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:46.908012 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:46.907985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rr5c\" (UniqueName: \"kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c\") pod \"network-check-target-t6snn\" (UID: \"129c1cb1-8484-40fe-b434-4354aab1d142\") " pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:46.908203 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:46.908182 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:46.908251 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:46.908211 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:46.908251 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:46.908226 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8rr5c for pod openshift-network-diagnostics/network-check-target-t6snn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:46.908363 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:46.908284 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c podName:129c1cb1-8484-40fe-b434-4354aab1d142 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:18.908265714 +0000 UTC m=+66.251517160 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-8rr5c" (UniqueName: "kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c") pod "network-check-target-t6snn" (UID: "129c1cb1-8484-40fe-b434-4354aab1d142") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:47.162783 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:47.162714 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:47.162908 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:47.162714 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:47.162908 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:47.162810 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f"
Apr 17 07:51:47.162908 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:47.162714 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:47.162908 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:47.162877 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:47.163055 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:47.162943 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:49.162662 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:49.162632 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn"
Apr 17 07:51:49.163283 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:49.162664 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:51:49.163283 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:49.162632 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr"
Apr 17 07:51:49.163283 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:49.162746 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t6snn" podUID="129c1cb1-8484-40fe-b434-4354aab1d142"
Apr 17 07:51:49.163283 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:49.162839 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lm9nr" podUID="9ce8efaa-a4ae-457a-b417-1aa180ef551f"
Apr 17 07:51:49.163283 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:49.162924 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:51:49.963057 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:49.962989 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-143.ec2.internal" event="NodeReady"
Apr 17 07:51:49.963157 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:49.963127 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 07:51:49.998710 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:49.998684 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7"]
Apr 17 07:51:50.037201 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.037180 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5cd847b7b9-j4hg8"]
Apr 17 07:51:50.037355 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.037338 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7"
Apr 17 07:51:50.040239 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.040216 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 17 07:51:50.040474 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.040453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 07:51:50.040644 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.040627 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-cvswh\""
Apr 17 07:51:50.040832 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.040815 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 07:51:50.040899 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.040833 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 07:51:50.056965 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.056947 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg"]
Apr 17 07:51:50.057124 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.057105 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:51:50.058942 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.058921 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 07:51:50.059293 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.059276 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 07:51:50.059403 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.059291 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6xw8q\""
Apr 17 07:51:50.059403 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.059276 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 07:51:50.064994 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.064967 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 07:51:50.081127 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.081106 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx"]
Apr 17 07:51:50.081250 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.081234 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg"
Apr 17 07:51:50.083283 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.083261 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 07:51:50.108008 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.107987 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7"]
Apr 17 07:51:50.108008 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.108010 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cd847b7b9-j4hg8"]
Apr 17 07:51:50.108135 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.108020 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx"]
Apr 17 07:51:50.108135 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.108030 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pw7cp"]
Apr 17 07:51:50.108195 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.108165 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx"
Apr 17 07:51:50.110643 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.110618 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 07:51:50.110752 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.110609 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 07:51:50.110752 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.110709 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 07:51:50.110994 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.110977 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 07:51:50.126488 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.126467 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sxlhk"]
Apr 17 07:51:50.126633 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.126616 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pw7cp"
Apr 17 07:51:50.128682 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.128655 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 07:51:50.128789 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.128667 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 07:51:50.128789 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.128726 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9vrx5\""
Apr 17 07:51:50.128895 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.128794 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 07:51:50.130488 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.130468 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggvlv\" (UniqueName: \"kubernetes.io/projected/34a606cd-fd5c-4966-bcee-663ef861c625-kube-api-access-ggvlv\") pod \"managed-serviceaccount-addon-agent-66f6555b97-p2pm7\" (UID: \"34a606cd-fd5c-4966-bcee-663ef861c625\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7"
Apr 17 07:51:50.130582 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.130500 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/34a606cd-fd5c-4966-bcee-663ef861c625-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-66f6555b97-p2pm7\" (UID: \"34a606cd-fd5c-4966-bcee-663ef861c625\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7"
Apr 17 07:51:50.141080 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.141058 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg"]
Apr 17 07:51:50.141190 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.141088 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pw7cp"]
Apr 17 07:51:50.141190 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.141107 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sxlhk"]
Apr 17 07:51:50.141293 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.141201 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sxlhk"
Apr 17 07:51:50.143316 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.143299 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 07:51:50.143429 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.143398 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mw88v\""
Apr 17 07:51:50.143484 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.143426 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 07:51:50.231238 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231170 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdphg\" (UniqueName: \"kubernetes.io/projected/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-kube-api-access-cdphg\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp"
Apr 17 07:51:50.231238 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmrt6\" (UniqueName: \"kubernetes.io/projected/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-kube-api-access-nmrt6\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/34a606cd-fd5c-4966-bcee-663ef861c625-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-66f6555b97-p2pm7\" (UID: \"34a606cd-fd5c-4966-bcee-663ef861c625\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231273 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231322 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b907fba-a1e3-48d5-ac78-6b5209129b2b-ca-trust-extracted\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231435 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-trusted-ca\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-installation-pull-secrets\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231507 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-bound-sa-token\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-ca\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231583 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk42j\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-kube-api-access-gk42j\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231607 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8edee96b-f66b-4c8b-af3b-de25372ba529-tmp\") pod \"klusterlet-addon-workmgr-7d8c787ff6-xwqxg\" (UID: \"8edee96b-f66b-4c8b-af3b-de25372ba529\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231633 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8edee96b-f66b-4c8b-af3b-de25372ba529-klusterlet-config\") pod \"klusterlet-addon-workmgr-7d8c787ff6-xwqxg\" (UID: \"8edee96b-f66b-4c8b-af3b-de25372ba529\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231661 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231686 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk7lc\" (UniqueName: \"kubernetes.io/projected/8edee96b-f66b-4c8b-af3b-de25372ba529-kube-api-access-sk7lc\") pod \"klusterlet-addon-workmgr-7d8c787ff6-xwqxg\" (UID: \"8edee96b-f66b-4c8b-af3b-de25372ba529\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg"
Apr 17 07:51:50.231855 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231763 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-hub\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx"
Apr 17 07:51:50.232323 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231790 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx"
Apr 17 07:51:50.232323 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID:
\"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.232323 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231852 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-certificates\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.232323 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231868 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-config-volume\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:50.232323 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggvlv\" (UniqueName: \"kubernetes.io/projected/34a606cd-fd5c-4966-bcee-663ef861c625-kube-api-access-ggvlv\") pod \"managed-serviceaccount-addon-agent-66f6555b97-p2pm7\" (UID: \"34a606cd-fd5c-4966-bcee-663ef861c625\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7" Apr 17 07:51:50.232323 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.231932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-image-registry-private-configuration\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.232323 ip-10-0-138-143 
kubenswrapper[2573]: I0417 07:51:50.232026 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-tmp-dir\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:50.232323 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.232057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pjfz\" (UniqueName: \"kubernetes.io/projected/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-kube-api-access-6pjfz\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.235390 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.235360 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/34a606cd-fd5c-4966-bcee-663ef861c625-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-66f6555b97-p2pm7\" (UID: \"34a606cd-fd5c-4966-bcee-663ef861c625\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7" Apr 17 07:51:50.239609 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.239589 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggvlv\" (UniqueName: \"kubernetes.io/projected/34a606cd-fd5c-4966-bcee-663ef861c625-kube-api-access-ggvlv\") pod \"managed-serviceaccount-addon-agent-66f6555b97-p2pm7\" (UID: \"34a606cd-fd5c-4966-bcee-663ef861c625\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7" Apr 17 07:51:50.328216 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.328187 2573 generic.go:358] "Generic (PLEG): container finished" podID="8e46b3b0-2ff5-430f-acab-a20e11bb02d0" 
containerID="9b29dd7cab147c2820431edc4af6b58785c2d37be332ca916a935d04acceeeff" exitCode=0 Apr 17 07:51:50.328307 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.328229 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tbd6q" event={"ID":"8e46b3b0-2ff5-430f-acab-a20e11bb02d0","Type":"ContainerDied","Data":"9b29dd7cab147c2820431edc4af6b58785c2d37be332ca916a935d04acceeeff"} Apr 17 07:51:50.332645 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.332627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-hub\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.332712 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.332653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.332712 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.332685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.332800 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.332710 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-certificates\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.332800 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.332735 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-config-volume\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:50.332800 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.332781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-image-registry-private-configuration\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.332941 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.332803 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-tmp-dir\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:50.332941 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.332853 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:51:50.332941 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.332873 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd847b7b9-j4hg8: secret "image-registry-tls" not found Apr 17 07:51:50.332941 ip-10-0-138-143 
kubenswrapper[2573]: E0417 07:51:50.332937 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls podName:8b907fba-a1e3-48d5-ac78-6b5209129b2b nodeName:}" failed. No retries permitted until 2026-04-17 07:51:50.83291857 +0000 UTC m=+38.176170017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls") pod "image-registry-5cd847b7b9-j4hg8" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b") : secret "image-registry-tls" not found Apr 17 07:51:50.333130 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.332961 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pjfz\" (UniqueName: \"kubernetes.io/projected/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-kube-api-access-6pjfz\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.333130 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.332996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdphg\" (UniqueName: \"kubernetes.io/projected/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-kube-api-access-cdphg\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp" Apr 17 07:51:50.333130 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmrt6\" (UniqueName: \"kubernetes.io/projected/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-kube-api-access-nmrt6\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:50.333130 ip-10-0-138-143 kubenswrapper[2573]: I0417 
07:51:50.333108 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-tmp-dir\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:50.333335 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.333335 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333224 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.333335 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp" Apr 17 07:51:50.333335 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b907fba-a1e3-48d5-ac78-6b5209129b2b-ca-trust-extracted\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " 
pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.333335 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-trusted-ca\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.333623 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-installation-pull-secrets\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.333623 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-bound-sa-token\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.333623 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-ca\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.333623 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333427 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-certificates\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.333623 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk42j\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-kube-api-access-gk42j\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.333623 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333474 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8edee96b-f66b-4c8b-af3b-de25372ba529-tmp\") pod \"klusterlet-addon-workmgr-7d8c787ff6-xwqxg\" (UID: \"8edee96b-f66b-4c8b-af3b-de25372ba529\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" Apr 17 07:51:50.333623 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8edee96b-f66b-4c8b-af3b-de25372ba529-klusterlet-config\") pod \"klusterlet-addon-workmgr-7d8c787ff6-xwqxg\" (UID: \"8edee96b-f66b-4c8b-af3b-de25372ba529\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" Apr 17 07:51:50.333623 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:50.333623 
ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sk7lc\" (UniqueName: \"kubernetes.io/projected/8edee96b-f66b-4c8b-af3b-de25372ba529-kube-api-access-sk7lc\") pod \"klusterlet-addon-workmgr-7d8c787ff6-xwqxg\" (UID: \"8edee96b-f66b-4c8b-af3b-de25372ba529\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" Apr 17 07:51:50.334103 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.333843 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-config-volume\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:50.334103 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.333941 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:51:50.334103 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.333999 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert podName:ce7f44b6-f8a9-4abf-b749-3dc63b29e396 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:50.833981843 +0000 UTC m=+38.177233300 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert") pod "ingress-canary-pw7cp" (UID: "ce7f44b6-f8a9-4abf-b749-3dc63b29e396") : secret "canary-serving-cert" not found Apr 17 07:51:50.334273 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.334192 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b907fba-a1e3-48d5-ac78-6b5209129b2b-ca-trust-extracted\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.334326 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.334311 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.334403 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.334336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8edee96b-f66b-4c8b-af3b-de25372ba529-tmp\") pod \"klusterlet-addon-workmgr-7d8c787ff6-xwqxg\" (UID: \"8edee96b-f66b-4c8b-af3b-de25372ba529\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" Apr 17 07:51:50.334926 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.334822 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:51:50.334926 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.334878 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls 
podName:bcb3e19a-a695-43e0-bfdc-31eb223b9c0c nodeName:}" failed. No retries permitted until 2026-04-17 07:51:50.834862811 +0000 UTC m=+38.178114256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls") pod "dns-default-sxlhk" (UID: "bcb3e19a-a695-43e0-bfdc-31eb223b9c0c") : secret "dns-default-metrics-tls" not found Apr 17 07:51:50.334926 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.334823 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-trusted-ca\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.336531 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.336498 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.336699 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.336668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-hub\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.336987 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.336963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.337233 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.337213 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-ca\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.337306 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.337286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8edee96b-f66b-4c8b-af3b-de25372ba529-klusterlet-config\") pod \"klusterlet-addon-workmgr-7d8c787ff6-xwqxg\" (UID: \"8edee96b-f66b-4c8b-af3b-de25372ba529\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" Apr 17 07:51:50.341351 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.341332 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmrt6\" (UniqueName: \"kubernetes.io/projected/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-kube-api-access-nmrt6\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:50.341681 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.341659 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pjfz\" (UniqueName: \"kubernetes.io/projected/5d8e5bc6-9fc7-432b-b033-23da46d4d68a-kube-api-access-6pjfz\") pod \"cluster-proxy-proxy-agent-695ffd4769-lmcwx\" (UID: \"5d8e5bc6-9fc7-432b-b033-23da46d4d68a\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.342256 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.342221 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk7lc\" (UniqueName: \"kubernetes.io/projected/8edee96b-f66b-4c8b-af3b-de25372ba529-kube-api-access-sk7lc\") pod \"klusterlet-addon-workmgr-7d8c787ff6-xwqxg\" (UID: \"8edee96b-f66b-4c8b-af3b-de25372ba529\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" Apr 17 07:51:50.345442 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.345422 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-installation-pull-secrets\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.345561 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.345462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-image-registry-private-configuration\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.347229 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.347212 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdphg\" (UniqueName: \"kubernetes.io/projected/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-kube-api-access-cdphg\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp" Apr 17 07:51:50.355750 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.355729 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7" Apr 17 07:51:50.356448 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.356431 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-bound-sa-token\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.356653 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.356632 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk42j\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-kube-api-access-gk42j\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.389435 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.389411 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" Apr 17 07:51:50.416490 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.416142 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:51:50.527813 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.527541 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7"] Apr 17 07:51:50.543293 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.543188 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg"] Apr 17 07:51:50.554988 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:50.547339 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34a606cd_fd5c_4966_bcee_663ef861c625.slice/crio-79f2b6b6513144b8a0271b9c29d08fcb3764ea4e02450a87ea2f6b2d4d35db2b WatchSource:0}: Error finding container 79f2b6b6513144b8a0271b9c29d08fcb3764ea4e02450a87ea2f6b2d4d35db2b: Status 404 returned error can't find the container with id 79f2b6b6513144b8a0271b9c29d08fcb3764ea4e02450a87ea2f6b2d4d35db2b Apr 17 07:51:50.584309 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.584286 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx"] Apr 17 07:51:50.586867 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:50.586841 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d8e5bc6_9fc7_432b_b033_23da46d4d68a.slice/crio-146396eabdf450fb855565daa2d8119a432876209f5a825739055092ee3d779b WatchSource:0}: Error finding container 146396eabdf450fb855565daa2d8119a432876209f5a825739055092ee3d779b: Status 404 returned error can't find the container with id 146396eabdf450fb855565daa2d8119a432876209f5a825739055092ee3d779b Apr 17 07:51:50.844295 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.844219 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:50.844295 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.844275 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:50.844486 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.844354 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:51:50.844486 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.844360 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:51:50.844486 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.844364 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd847b7b9-j4hg8: secret "image-registry-tls" not found Apr 17 07:51:50.844486 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.844447 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls podName:bcb3e19a-a695-43e0-bfdc-31eb223b9c0c nodeName:}" failed. No retries permitted until 2026-04-17 07:51:51.844432585 +0000 UTC m=+39.187684025 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls") pod "dns-default-sxlhk" (UID: "bcb3e19a-a695-43e0-bfdc-31eb223b9c0c") : secret "dns-default-metrics-tls" not found Apr 17 07:51:50.844659 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:50.844500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp" Apr 17 07:51:50.844659 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.844532 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls podName:8b907fba-a1e3-48d5-ac78-6b5209129b2b nodeName:}" failed. No retries permitted until 2026-04-17 07:51:51.844512531 +0000 UTC m=+39.187763981 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls") pod "image-registry-5cd847b7b9-j4hg8" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b") : secret "image-registry-tls" not found Apr 17 07:51:50.844659 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.844555 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:51:50.844659 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:50.844597 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert podName:ce7f44b6-f8a9-4abf-b749-3dc63b29e396 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:51.844584783 +0000 UTC m=+39.187836225 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert") pod "ingress-canary-pw7cp" (UID: "ce7f44b6-f8a9-4abf-b749-3dc63b29e396") : secret "canary-serving-cert" not found Apr 17 07:51:51.164599 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.163974 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:51:51.164599 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.163974 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:51.164599 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.163986 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:51:51.167293 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.167108 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 07:51:51.167293 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.167127 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 07:51:51.167293 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.167146 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 07:51:51.167554 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.167350 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bmlbc\"" Apr 17 07:51:51.167554 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.167482 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjzzs\"" Apr 17 
07:51:51.167554 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.167520 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 07:51:51.331279 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.331241 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" event={"ID":"8edee96b-f66b-4c8b-af3b-de25372ba529","Type":"ContainerStarted","Data":"761154fccf8d8aff554d8af6d0553355c181bf26c20d23db333b2999028a6ca0"} Apr 17 07:51:51.336429 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.336402 2573 generic.go:358] "Generic (PLEG): container finished" podID="8e46b3b0-2ff5-430f-acab-a20e11bb02d0" containerID="6b0a7ae6a93831fd0c76e73f63b23d820802f8162db6619473cae19ddd686dec" exitCode=0 Apr 17 07:51:51.336558 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.336471 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tbd6q" event={"ID":"8e46b3b0-2ff5-430f-acab-a20e11bb02d0","Type":"ContainerDied","Data":"6b0a7ae6a93831fd0c76e73f63b23d820802f8162db6619473cae19ddd686dec"} Apr 17 07:51:51.341947 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.341911 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" event={"ID":"5d8e5bc6-9fc7-432b-b033-23da46d4d68a","Type":"ContainerStarted","Data":"146396eabdf450fb855565daa2d8119a432876209f5a825739055092ee3d779b"} Apr 17 07:51:51.343363 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.343327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7" event={"ID":"34a606cd-fd5c-4966-bcee-663ef861c625","Type":"ContainerStarted","Data":"79f2b6b6513144b8a0271b9c29d08fcb3764ea4e02450a87ea2f6b2d4d35db2b"} Apr 17 07:51:51.855166 ip-10-0-138-143 
kubenswrapper[2573]: I0417 07:51:51.854278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp" Apr 17 07:51:51.855166 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.854344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:51.855166 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:51.854419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:51.855166 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:51.854550 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:51:51.855166 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:51.854564 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd847b7b9-j4hg8: secret "image-registry-tls" not found Apr 17 07:51:51.855166 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:51.854624 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls podName:8b907fba-a1e3-48d5-ac78-6b5209129b2b nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:53.854604346 +0000 UTC m=+41.197855802 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls") pod "image-registry-5cd847b7b9-j4hg8" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b") : secret "image-registry-tls" not found Apr 17 07:51:51.855166 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:51.855023 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:51:51.855166 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:51.855070 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert podName:ce7f44b6-f8a9-4abf-b749-3dc63b29e396 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:53.855055952 +0000 UTC m=+41.198307399 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert") pod "ingress-canary-pw7cp" (UID: "ce7f44b6-f8a9-4abf-b749-3dc63b29e396") : secret "canary-serving-cert" not found Apr 17 07:51:51.855166 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:51.855132 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:51:51.855766 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:51.855194 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls podName:bcb3e19a-a695-43e0-bfdc-31eb223b9c0c nodeName:}" failed. No retries permitted until 2026-04-17 07:51:53.855176105 +0000 UTC m=+41.198427553 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls") pod "dns-default-sxlhk" (UID: "bcb3e19a-a695-43e0-bfdc-31eb223b9c0c") : secret "dns-default-metrics-tls" not found Apr 17 07:51:52.351026 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:52.350980 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tbd6q" event={"ID":"8e46b3b0-2ff5-430f-acab-a20e11bb02d0","Type":"ContainerStarted","Data":"31255b154c4c5d228c93cf0ffca6e83e59b08589c75b18cb263d6e5e548c4c68"} Apr 17 07:51:52.375115 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:52.375060 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tbd6q" podStartSLOduration=5.411545429 podStartE2EDuration="39.37504097s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="2026-04-17 07:51:15.944966226 +0000 UTC m=+3.288217675" lastFinishedPulling="2026-04-17 07:51:49.908461772 +0000 UTC m=+37.251713216" observedRunningTime="2026-04-17 07:51:52.373029354 +0000 UTC m=+39.716280819" watchObservedRunningTime="2026-04-17 07:51:52.37504097 +0000 UTC m=+39.718292433" Apr 17 07:51:52.562954 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:52.562740 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret\") pod \"global-pull-secret-syncer-lm9nr\" (UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:52.577307 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:52.577051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ce8efaa-a4ae-457a-b417-1aa180ef551f-original-pull-secret\") pod \"global-pull-secret-syncer-lm9nr\" 
(UID: \"9ce8efaa-a4ae-457a-b417-1aa180ef551f\") " pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:52.686327 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:52.686295 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lm9nr" Apr 17 07:51:53.872586 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:53.872553 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:53.873188 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:53.872651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp" Apr 17 07:51:53.873188 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:53.872701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:53.873188 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:53.872720 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:51:53.873188 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:53.872751 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd847b7b9-j4hg8: secret "image-registry-tls" not found Apr 17 07:51:53.873188 
ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:53.872812 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:51:53.873188 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:53.872827 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls podName:8b907fba-a1e3-48d5-ac78-6b5209129b2b nodeName:}" failed. No retries permitted until 2026-04-17 07:51:57.872804791 +0000 UTC m=+45.216056245 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls") pod "image-registry-5cd847b7b9-j4hg8" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b") : secret "image-registry-tls" not found Apr 17 07:51:53.873188 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:53.872851 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:51:53.873188 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:53.872870 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert podName:ce7f44b6-f8a9-4abf-b749-3dc63b29e396 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:57.87285291 +0000 UTC m=+45.216104364 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert") pod "ingress-canary-pw7cp" (UID: "ce7f44b6-f8a9-4abf-b749-3dc63b29e396") : secret "canary-serving-cert" not found Apr 17 07:51:53.873188 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:53.872888 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls podName:bcb3e19a-a695-43e0-bfdc-31eb223b9c0c nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:57.872878906 +0000 UTC m=+45.216130347 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls") pod "dns-default-sxlhk" (UID: "bcb3e19a-a695-43e0-bfdc-31eb223b9c0c") : secret "dns-default-metrics-tls" not found Apr 17 07:51:56.792940 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:56.792731 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lm9nr"] Apr 17 07:51:56.799433 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:51:56.796552 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ce8efaa_a4ae_457a_b417_1aa180ef551f.slice/crio-d146fa2ae854b1ee6af31d437781e5fdaa084ef2fc6aa1feb711f7e61f805c14 WatchSource:0}: Error finding container d146fa2ae854b1ee6af31d437781e5fdaa084ef2fc6aa1feb711f7e61f805c14: Status 404 returned error can't find the container with id d146fa2ae854b1ee6af31d437781e5fdaa084ef2fc6aa1feb711f7e61f805c14 Apr 17 07:51:57.364267 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:57.364212 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7" event={"ID":"34a606cd-fd5c-4966-bcee-663ef861c625","Type":"ContainerStarted","Data":"ffdd8c6051afa2b5b7ee07fe5bba1444a52139d65467a8470430f11a3a4d723b"} Apr 17 07:51:57.366003 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:57.365974 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" event={"ID":"8edee96b-f66b-4c8b-af3b-de25372ba529","Type":"ContainerStarted","Data":"7bb5478d1a265ec9b398900bab2834bb0b579e4b938b2c23702e8bf7a35c8f8e"} Apr 17 07:51:57.366204 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:57.366187 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" Apr 17 07:51:57.367327 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:57.367298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lm9nr" event={"ID":"9ce8efaa-a4ae-457a-b417-1aa180ef551f","Type":"ContainerStarted","Data":"d146fa2ae854b1ee6af31d437781e5fdaa084ef2fc6aa1feb711f7e61f805c14"} Apr 17 07:51:57.368176 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:57.368155 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" Apr 17 07:51:57.368880 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:57.368853 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" event={"ID":"5d8e5bc6-9fc7-432b-b033-23da46d4d68a","Type":"ContainerStarted","Data":"f93bd34da7437b45b6db5663994dfe60757f67531cdd9ab3d8a87756ae79a5e8"} Apr 17 07:51:57.379858 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:57.379819 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7" podStartSLOduration=37.258132099 podStartE2EDuration="43.379807666s" podCreationTimestamp="2026-04-17 07:51:14 +0000 UTC" firstStartedPulling="2026-04-17 07:51:50.556929053 +0000 UTC m=+37.900180500" lastFinishedPulling="2026-04-17 07:51:56.678604608 +0000 UTC m=+44.021856067" observedRunningTime="2026-04-17 07:51:57.379560247 +0000 UTC m=+44.722811711" watchObservedRunningTime="2026-04-17 07:51:57.379807666 +0000 UTC m=+44.723059126" Apr 17 07:51:57.398107 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:57.398069 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" podStartSLOduration=37.284037262 podStartE2EDuration="43.398057353s" podCreationTimestamp="2026-04-17 07:51:14 +0000 UTC" firstStartedPulling="2026-04-17 07:51:50.55697907 +0000 UTC m=+37.900230525" lastFinishedPulling="2026-04-17 07:51:56.670999161 +0000 UTC m=+44.014250616" observedRunningTime="2026-04-17 07:51:57.397496444 +0000 UTC m=+44.740747907" watchObservedRunningTime="2026-04-17 07:51:57.398057353 +0000 UTC m=+44.741308845" Apr 17 07:51:57.902694 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:57.902663 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:51:57.903101 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:57.902740 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp" Apr 17 07:51:57.903101 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:51:57.902784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:51:57.903101 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:57.902830 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:51:57.903101 ip-10-0-138-143 kubenswrapper[2573]: E0417 
07:51:57.902849 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd847b7b9-j4hg8: secret "image-registry-tls" not found Apr 17 07:51:57.903101 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:57.902873 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:51:57.903101 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:57.902908 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls podName:8b907fba-a1e3-48d5-ac78-6b5209129b2b nodeName:}" failed. No retries permitted until 2026-04-17 07:52:05.902891224 +0000 UTC m=+53.246142668 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls") pod "image-registry-5cd847b7b9-j4hg8" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b") : secret "image-registry-tls" not found Apr 17 07:51:57.903101 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:57.902908 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:51:57.903101 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:57.902927 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls podName:bcb3e19a-a695-43e0-bfdc-31eb223b9c0c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:05.90291722 +0000 UTC m=+53.246168661 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls") pod "dns-default-sxlhk" (UID: "bcb3e19a-a695-43e0-bfdc-31eb223b9c0c") : secret "dns-default-metrics-tls" not found Apr 17 07:51:57.903101 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:51:57.902959 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert podName:ce7f44b6-f8a9-4abf-b749-3dc63b29e396 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:05.902944176 +0000 UTC m=+53.246195619 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert") pod "ingress-canary-pw7cp" (UID: "ce7f44b6-f8a9-4abf-b749-3dc63b29e396") : secret "canary-serving-cert" not found Apr 17 07:52:00.377708 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:00.377654 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" event={"ID":"5d8e5bc6-9fc7-432b-b033-23da46d4d68a","Type":"ContainerStarted","Data":"8f8742a723dc21bbb0f6f65fb4a42f326c7744ef6bcabd3796171c513e549619"} Apr 17 07:52:00.377708 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:00.377700 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" event={"ID":"5d8e5bc6-9fc7-432b-b033-23da46d4d68a","Type":"ContainerStarted","Data":"aa39925a280476768c26aff08b3f57c89133f00814eb55b1f2b9171d92a470ff"} Apr 17 07:52:00.396579 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:00.396540 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" podStartSLOduration=37.539055017 podStartE2EDuration="46.396527033s" podCreationTimestamp="2026-04-17 07:51:14 +0000 UTC" 
firstStartedPulling="2026-04-17 07:51:50.58862263 +0000 UTC m=+37.931874072" lastFinishedPulling="2026-04-17 07:51:59.446094633 +0000 UTC m=+46.789346088" observedRunningTime="2026-04-17 07:52:00.396034267 +0000 UTC m=+47.739285733" watchObservedRunningTime="2026-04-17 07:52:00.396527033 +0000 UTC m=+47.739778496" Apr 17 07:52:02.382621 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:02.382584 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lm9nr" event={"ID":"9ce8efaa-a4ae-457a-b417-1aa180ef551f","Type":"ContainerStarted","Data":"4ad308d7b658ff8444604ee3ebca2ec0f79dfb2e1ec384590a53ce25cdd8ad0b"} Apr 17 07:52:05.958616 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:05.958584 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp" Apr 17 07:52:05.959148 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:05.958632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:52:05.959148 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:05.958658 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:52:05.959148 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:05.958734 2573 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:05.959148 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:05.958791 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert podName:ce7f44b6-f8a9-4abf-b749-3dc63b29e396 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:21.958777654 +0000 UTC m=+69.302029095 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert") pod "ingress-canary-pw7cp" (UID: "ce7f44b6-f8a9-4abf-b749-3dc63b29e396") : secret "canary-serving-cert" not found Apr 17 07:52:05.959148 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:05.958738 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:52:05.959148 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:05.958812 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd847b7b9-j4hg8: secret "image-registry-tls" not found Apr 17 07:52:05.959148 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:05.958737 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:05.959148 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:05.958835 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls podName:8b907fba-a1e3-48d5-ac78-6b5209129b2b nodeName:}" failed. No retries permitted until 2026-04-17 07:52:21.95882803 +0000 UTC m=+69.302079471 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls") pod "image-registry-5cd847b7b9-j4hg8" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b") : secret "image-registry-tls" not found Apr 17 07:52:05.959148 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:05.958867 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls podName:bcb3e19a-a695-43e0-bfdc-31eb223b9c0c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:21.958856722 +0000 UTC m=+69.302108163 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls") pod "dns-default-sxlhk" (UID: "bcb3e19a-a695-43e0-bfdc-31eb223b9c0c") : secret "dns-default-metrics-tls" not found Apr 17 07:52:15.329870 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:15.329838 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwtmq" Apr 17 07:52:15.356294 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:15.356247 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lm9nr" podStartSLOduration=50.511480685 podStartE2EDuration="55.356233269s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:51:56.798130625 +0000 UTC m=+44.141382066" lastFinishedPulling="2026-04-17 07:52:01.64288321 +0000 UTC m=+48.986134650" observedRunningTime="2026-04-17 07:52:02.397454462 +0000 UTC m=+49.740705924" watchObservedRunningTime="2026-04-17 07:52:15.356233269 +0000 UTC m=+62.699484735" Apr 17 07:52:18.850673 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:18.850640 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:52:18.853462 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:18.853443 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 07:52:18.861503 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:18.861486 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 07:52:18.861590 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:18.861553 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs podName:7e501027-4496-4a12-a9d5-fc5c57942102 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:22.8615316 +0000 UTC m=+130.204783041 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs") pod "network-metrics-daemon-jvr45" (UID: "7e501027-4496-4a12-a9d5-fc5c57942102") : secret "metrics-daemon-secret" not found Apr 17 07:52:18.951863 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:18.951837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rr5c\" (UniqueName: \"kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c\") pod \"network-check-target-t6snn\" (UID: \"129c1cb1-8484-40fe-b434-4354aab1d142\") " pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:52:18.954511 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:18.954494 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 07:52:18.965003 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:18.964983 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 07:52:18.977188 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:18.977164 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rr5c\" (UniqueName: \"kubernetes.io/projected/129c1cb1-8484-40fe-b434-4354aab1d142-kube-api-access-8rr5c\") pod \"network-check-target-t6snn\" (UID: \"129c1cb1-8484-40fe-b434-4354aab1d142\") " pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:52:19.096831 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:19.096809 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bmlbc\"" Apr 17 07:52:19.105098 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:19.105054 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:52:19.218497 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:19.218464 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t6snn"] Apr 17 07:52:19.221145 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:52:19.221118 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod129c1cb1_8484_40fe_b434_4354aab1d142.slice/crio-9c7208863daca34697dd7a04c2b0b2976f8a5430c3f31949de74cd64d807c94f WatchSource:0}: Error finding container 9c7208863daca34697dd7a04c2b0b2976f8a5430c3f31949de74cd64d807c94f: Status 404 returned error can't find the container with id 9c7208863daca34697dd7a04c2b0b2976f8a5430c3f31949de74cd64d807c94f Apr 17 07:52:19.430157 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:19.430093 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t6snn" event={"ID":"129c1cb1-8484-40fe-b434-4354aab1d142","Type":"ContainerStarted","Data":"9c7208863daca34697dd7a04c2b0b2976f8a5430c3f31949de74cd64d807c94f"} Apr 17 07:52:21.975503 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:21.975472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:52:21.976039 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:21.975517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " 
pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:52:21.976039 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:21.975557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp" Apr 17 07:52:21.976039 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:21.975622 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:21.976039 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:21.975642 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:21.976039 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:21.975690 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls podName:bcb3e19a-a695-43e0-bfdc-31eb223b9c0c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:53.97567292 +0000 UTC m=+101.318924362 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls") pod "dns-default-sxlhk" (UID: "bcb3e19a-a695-43e0-bfdc-31eb223b9c0c") : secret "dns-default-metrics-tls" not found Apr 17 07:52:21.976039 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:21.975698 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:52:21.976039 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:21.975718 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd847b7b9-j4hg8: secret "image-registry-tls" not found Apr 17 07:52:21.976039 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:21.975705 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert podName:ce7f44b6-f8a9-4abf-b749-3dc63b29e396 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:53.975698885 +0000 UTC m=+101.318950325 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert") pod "ingress-canary-pw7cp" (UID: "ce7f44b6-f8a9-4abf-b749-3dc63b29e396") : secret "canary-serving-cert" not found Apr 17 07:52:21.976039 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:21.975814 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls podName:8b907fba-a1e3-48d5-ac78-6b5209129b2b nodeName:}" failed. No retries permitted until 2026-04-17 07:52:53.975795255 +0000 UTC m=+101.319046704 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls") pod "image-registry-5cd847b7b9-j4hg8" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b") : secret "image-registry-tls" not found Apr 17 07:52:22.437410 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:22.437357 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t6snn" event={"ID":"129c1cb1-8484-40fe-b434-4354aab1d142","Type":"ContainerStarted","Data":"fb3261e1397d0b772b0fecc8ac7f7540dc3bc4357463cb67b13c1e8055004aff"} Apr 17 07:52:22.437627 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:22.437605 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:52:22.453484 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:22.453441 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-t6snn" podStartSLOduration=66.607496153 podStartE2EDuration="1m9.45342819s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="2026-04-17 07:52:19.222963876 +0000 UTC m=+66.566215317" lastFinishedPulling="2026-04-17 07:52:22.068895913 +0000 UTC m=+69.412147354" observedRunningTime="2026-04-17 07:52:22.452522668 +0000 UTC m=+69.795774133" watchObservedRunningTime="2026-04-17 07:52:22.45342819 +0000 UTC m=+69.796679687" Apr 17 07:52:53.442608 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:53.442585 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-t6snn" Apr 17 07:52:53.989894 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:53.989861 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls\") pod 
\"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk" Apr 17 07:52:53.990090 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:53.989910 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" Apr 17 07:52:53.990090 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:52:53.989945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp" Apr 17 07:52:53.990090 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:53.990014 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:53.990090 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:53.990030 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:53.990090 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:53.990080 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls podName:bcb3e19a-a695-43e0-bfdc-31eb223b9c0c nodeName:}" failed. No retries permitted until 2026-04-17 07:53:57.990065313 +0000 UTC m=+165.333316754 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls") pod "dns-default-sxlhk" (UID: "bcb3e19a-a695-43e0-bfdc-31eb223b9c0c") : secret "dns-default-metrics-tls" not found Apr 17 07:52:53.990090 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:53.990082 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:52:53.990090 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:53.990094 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert podName:ce7f44b6-f8a9-4abf-b749-3dc63b29e396 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:57.990088683 +0000 UTC m=+165.333340125 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert") pod "ingress-canary-pw7cp" (UID: "ce7f44b6-f8a9-4abf-b749-3dc63b29e396") : secret "canary-serving-cert" not found Apr 17 07:52:53.990409 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:53.990099 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd847b7b9-j4hg8: secret "image-registry-tls" not found Apr 17 07:52:53.990409 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:52:53.990161 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls podName:8b907fba-a1e3-48d5-ac78-6b5209129b2b nodeName:}" failed. No retries permitted until 2026-04-17 07:53:57.990144298 +0000 UTC m=+165.333395739 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls") pod "image-registry-5cd847b7b9-j4hg8" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b") : secret "image-registry-tls" not found Apr 17 07:53:16.661315 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:16.661284 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hslwj_75720dfb-fe8b-42c4-b690-1725be056c2e/dns-node-resolver/0.log" Apr 17 07:53:17.661186 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:17.661155 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qd97n_8318f513-f957-48a0-821d-d6718c08e6cb/node-ca/0.log" Apr 17 07:53:22.890250 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:22.890215 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:53:22.890767 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:53:22.890393 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 07:53:22.890767 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:53:22.890500 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs podName:7e501027-4496-4a12-a9d5-fc5c57942102 nodeName:}" failed. No retries permitted until 2026-04-17 07:55:24.89046574 +0000 UTC m=+252.233717185 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs") pod "network-metrics-daemon-jvr45" (UID: "7e501027-4496-4a12-a9d5-fc5c57942102") : secret "metrics-daemon-secret" not found Apr 17 07:53:30.417334 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:30.417269 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" podUID="5d8e5bc6-9fc7-432b-b033-23da46d4d68a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 07:53:40.417180 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:40.417133 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" podUID="5d8e5bc6-9fc7-432b-b033-23da46d4d68a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 07:53:41.208393 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.208341 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-q4lq6"] Apr 17 07:53:41.210492 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.210476 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.219980 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.219960 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 07:53:41.224547 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.224533 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 07:53:41.224811 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.224797 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 07:53:41.225322 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.225308 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 07:53:41.225813 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.225797 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6r99z\"" Apr 17 07:53:41.255357 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.255335 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q4lq6"] Apr 17 07:53:41.314880 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.314854 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d17df228-c496-4bd6-9af7-4ceb036c7530-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.314987 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.314884 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d17df228-c496-4bd6-9af7-4ceb036c7530-crio-socket\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.314987 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.314905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d17df228-c496-4bd6-9af7-4ceb036c7530-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.314987 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.314928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d17df228-c496-4bd6-9af7-4ceb036c7530-data-volume\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.315155 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.315090 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhp5v\" (UniqueName: \"kubernetes.io/projected/d17df228-c496-4bd6-9af7-4ceb036c7530-kube-api-access-vhp5v\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.415922 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.415898 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d17df228-c496-4bd6-9af7-4ceb036c7530-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " 
pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.416013 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.415932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d17df228-c496-4bd6-9af7-4ceb036c7530-data-volume\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.416071 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.416021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhp5v\" (UniqueName: \"kubernetes.io/projected/d17df228-c496-4bd6-9af7-4ceb036c7530-kube-api-access-vhp5v\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.416071 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.416049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d17df228-c496-4bd6-9af7-4ceb036c7530-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.416169 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.416068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d17df228-c496-4bd6-9af7-4ceb036c7530-crio-socket\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.416169 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.416147 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/d17df228-c496-4bd6-9af7-4ceb036c7530-crio-socket\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.416349 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.416323 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d17df228-c496-4bd6-9af7-4ceb036c7530-data-volume\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.416437 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.416406 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d17df228-c496-4bd6-9af7-4ceb036c7530-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.418375 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.418355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d17df228-c496-4bd6-9af7-4ceb036c7530-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.426346 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.426327 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhp5v\" (UniqueName: \"kubernetes.io/projected/d17df228-c496-4bd6-9af7-4ceb036c7530-kube-api-access-vhp5v\") pod \"insights-runtime-extractor-q4lq6\" (UID: \"d17df228-c496-4bd6-9af7-4ceb036c7530\") " pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.518935 ip-10-0-138-143 kubenswrapper[2573]: 
I0417 07:53:41.518881 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q4lq6" Apr 17 07:53:41.638063 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:41.638034 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q4lq6"] Apr 17 07:53:41.641049 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:53:41.641021 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd17df228_c496_4bd6_9af7_4ceb036c7530.slice/crio-f33ed8540b7502a4b7a08c8c9018c376eea604549606bf1984184e07ed4753bc WatchSource:0}: Error finding container f33ed8540b7502a4b7a08c8c9018c376eea604549606bf1984184e07ed4753bc: Status 404 returned error can't find the container with id f33ed8540b7502a4b7a08c8c9018c376eea604549606bf1984184e07ed4753bc Apr 17 07:53:42.629502 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:42.629425 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4lq6" event={"ID":"d17df228-c496-4bd6-9af7-4ceb036c7530","Type":"ContainerStarted","Data":"6d7fb9a37a5b0d6ae14559108acc8d5ad396ddb6fd11d8b518658013d5574ca7"} Apr 17 07:53:42.629502 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:42.629461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4lq6" event={"ID":"d17df228-c496-4bd6-9af7-4ceb036c7530","Type":"ContainerStarted","Data":"18f75bc2e9695ac90d449d43dc1c1b27ece882ceba42088a16a53c60889b3d78"} Apr 17 07:53:42.629502 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:42.629470 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4lq6" event={"ID":"d17df228-c496-4bd6-9af7-4ceb036c7530","Type":"ContainerStarted","Data":"f33ed8540b7502a4b7a08c8c9018c376eea604549606bf1984184e07ed4753bc"} Apr 17 07:53:44.637823 ip-10-0-138-143 
kubenswrapper[2573]: I0417 07:53:44.637775 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4lq6" event={"ID":"d17df228-c496-4bd6-9af7-4ceb036c7530","Type":"ContainerStarted","Data":"6330b651cf2d6de4afe3ed2bcd5635c528f7a0db08c3b8019040eeac3dbf0636"} Apr 17 07:53:44.654796 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:44.654742 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-q4lq6" podStartSLOduration=1.421311266 podStartE2EDuration="3.654728027s" podCreationTimestamp="2026-04-17 07:53:41 +0000 UTC" firstStartedPulling="2026-04-17 07:53:41.689775768 +0000 UTC m=+149.033027210" lastFinishedPulling="2026-04-17 07:53:43.923192529 +0000 UTC m=+151.266443971" observedRunningTime="2026-04-17 07:53:44.653793528 +0000 UTC m=+151.997044991" watchObservedRunningTime="2026-04-17 07:53:44.654728027 +0000 UTC m=+151.997979490" Apr 17 07:53:50.417736 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:50.417696 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" podUID="5d8e5bc6-9fc7-432b-b033-23da46d4d68a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 07:53:50.418217 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:50.417760 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" Apr 17 07:53:50.418217 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:50.418190 2573 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"8f8742a723dc21bbb0f6f65fb4a42f326c7744ef6bcabd3796171c513e549619"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" containerMessage="Container service-proxy failed 
liveness probe, will be restarted"
Apr 17 07:53:50.418333 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:50.418240 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" podUID="5d8e5bc6-9fc7-432b-b033-23da46d4d68a" containerName="service-proxy" containerID="cri-o://8f8742a723dc21bbb0f6f65fb4a42f326c7744ef6bcabd3796171c513e549619" gracePeriod=30
Apr 17 07:53:50.657714 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:50.657681 2573 generic.go:358] "Generic (PLEG): container finished" podID="5d8e5bc6-9fc7-432b-b033-23da46d4d68a" containerID="8f8742a723dc21bbb0f6f65fb4a42f326c7744ef6bcabd3796171c513e549619" exitCode=2
Apr 17 07:53:50.657881 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:50.657742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" event={"ID":"5d8e5bc6-9fc7-432b-b033-23da46d4d68a","Type":"ContainerDied","Data":"8f8742a723dc21bbb0f6f65fb4a42f326c7744ef6bcabd3796171c513e549619"}
Apr 17 07:53:50.657881 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:50.657779 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695ffd4769-lmcwx" event={"ID":"5d8e5bc6-9fc7-432b-b033-23da46d4d68a","Type":"ContainerStarted","Data":"976e550478baeb9d04452ab19c60d2d50223fff91d448b2f3bd6a88c30f21515"}
Apr 17 07:53:53.066872 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:53:53.066783 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" podUID="8b907fba-a1e3-48d5-ac78-6b5209129b2b"
Apr 17 07:53:53.149671 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:53:53.149639 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-pw7cp" podUID="ce7f44b6-f8a9-4abf-b749-3dc63b29e396"
Apr 17 07:53:53.154830 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:53:53.154805 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-sxlhk" podUID="bcb3e19a-a695-43e0-bfdc-31eb223b9c0c"
Apr 17 07:53:53.664442 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:53.664412 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:53:53.664618 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:53.664412 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pw7cp"
Apr 17 07:53:54.178907 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:53:54.178870 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jvr45" podUID="7e501027-4496-4a12-a9d5-fc5c57942102"
Apr 17 07:53:54.781712 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.781683 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nznm9"]
Apr 17 07:53:54.783679 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.783663 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.786365 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.786334 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 07:53:54.786495 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.786400 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 07:53:54.786666 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.786643 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s4jk5\""
Apr 17 07:53:54.787409 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.787366 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 07:53:54.787721 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.787696 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 07:53:54.788143 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.788105 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 07:53:54.788859 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.788839 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 07:53:54.808772 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.808746 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3bf17a6b-1e32-46a8-b120-2174e7e517b3-sys\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.808867 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.808773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf17a6b-1e32-46a8-b120-2174e7e517b3-metrics-client-ca\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.808867 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.808790 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3bf17a6b-1e32-46a8-b120-2174e7e517b3-root\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.808867 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.808820 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-wtmp\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.809063 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.808880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-tls\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.809063 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.808918 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jcmm\" (UniqueName: \"kubernetes.io/projected/3bf17a6b-1e32-46a8-b120-2174e7e517b3-kube-api-access-6jcmm\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.809063 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.808953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-textfile\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.809063 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.808978 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-accelerators-collector-config\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.809063 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.809035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.909968 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.909944 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3bf17a6b-1e32-46a8-b120-2174e7e517b3-sys\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910065 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.909976 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf17a6b-1e32-46a8-b120-2174e7e517b3-metrics-client-ca\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910065 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.909999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3bf17a6b-1e32-46a8-b120-2174e7e517b3-root\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910065 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910054 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-wtmp\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910224 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910075 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3bf17a6b-1e32-46a8-b120-2174e7e517b3-root\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910224 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910092 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-tls\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910224 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910055 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3bf17a6b-1e32-46a8-b120-2174e7e517b3-sys\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910224 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jcmm\" (UniqueName: \"kubernetes.io/projected/3bf17a6b-1e32-46a8-b120-2174e7e517b3-kube-api-access-6jcmm\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910224 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:53:54.910173 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 07:53:54.910224 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-textfile\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910224 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910205 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-wtmp\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910224 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-accelerators-collector-config\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910224 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:53:54.910226 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-tls podName:3bf17a6b-1e32-46a8-b120-2174e7e517b3 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:55.41021199 +0000 UTC m=+162.753463430 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-tls") pod "node-exporter-nznm9" (UID: "3bf17a6b-1e32-46a8-b120-2174e7e517b3") : secret "node-exporter-tls" not found
Apr 17 07:53:54.910668 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910281 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910668 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910527 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf17a6b-1e32-46a8-b120-2174e7e517b3-metrics-client-ca\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910668 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910577 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-textfile\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.910854 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.910834 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-accelerators-collector-config\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.912628 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.912611 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:54.918647 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:54.918628 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jcmm\" (UniqueName: \"kubernetes.io/projected/3bf17a6b-1e32-46a8-b120-2174e7e517b3-kube-api-access-6jcmm\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:55.413736 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:55.413705 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-tls\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:55.416143 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:55.416125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3bf17a6b-1e32-46a8-b120-2174e7e517b3-node-exporter-tls\") pod \"node-exporter-nznm9\" (UID: \"3bf17a6b-1e32-46a8-b120-2174e7e517b3\") " pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:55.696642 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:55.696619 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nznm9"
Apr 17 07:53:55.704354 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:53:55.704330 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bf17a6b_1e32_46a8_b120_2174e7e517b3.slice/crio-7121f07fe81b4f75d9120a96b110017c815f1fc7e03f8e49eb747d6ce90749d8 WatchSource:0}: Error finding container 7121f07fe81b4f75d9120a96b110017c815f1fc7e03f8e49eb747d6ce90749d8: Status 404 returned error can't find the container with id 7121f07fe81b4f75d9120a96b110017c815f1fc7e03f8e49eb747d6ce90749d8
Apr 17 07:53:56.671791 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:56.671750 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nznm9" event={"ID":"3bf17a6b-1e32-46a8-b120-2174e7e517b3","Type":"ContainerStarted","Data":"7121f07fe81b4f75d9120a96b110017c815f1fc7e03f8e49eb747d6ce90749d8"}
Apr 17 07:53:57.367085 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:57.367047 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" podUID="8edee96b-f66b-4c8b-af3b-de25372ba529" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/readyz\": dial tcp 10.132.0.9:8000: connect: connection refused"
Apr 17 07:53:57.675339 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:57.675260 2573 generic.go:358] "Generic (PLEG): container finished" podID="3bf17a6b-1e32-46a8-b120-2174e7e517b3" containerID="b99623a4c0018c654260fb91c7ca72c683e35c5d6fee3fbe255b9d19481a2285" exitCode=0
Apr 17 07:53:57.675797 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:57.675346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nznm9" event={"ID":"3bf17a6b-1e32-46a8-b120-2174e7e517b3","Type":"ContainerDied","Data":"b99623a4c0018c654260fb91c7ca72c683e35c5d6fee3fbe255b9d19481a2285"}
Apr 17 07:53:57.676644 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:57.676627 2573 generic.go:358] "Generic (PLEG): container finished" podID="34a606cd-fd5c-4966-bcee-663ef861c625" containerID="ffdd8c6051afa2b5b7ee07fe5bba1444a52139d65467a8470430f11a3a4d723b" exitCode=255
Apr 17 07:53:57.676726 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:57.676695 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7" event={"ID":"34a606cd-fd5c-4966-bcee-663ef861c625","Type":"ContainerDied","Data":"ffdd8c6051afa2b5b7ee07fe5bba1444a52139d65467a8470430f11a3a4d723b"}
Apr 17 07:53:57.676987 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:57.676972 2573 scope.go:117] "RemoveContainer" containerID="ffdd8c6051afa2b5b7ee07fe5bba1444a52139d65467a8470430f11a3a4d723b"
Apr 17 07:53:57.677968 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:57.677947 2573 generic.go:358] "Generic (PLEG): container finished" podID="8edee96b-f66b-4c8b-af3b-de25372ba529" containerID="7bb5478d1a265ec9b398900bab2834bb0b579e4b938b2c23702e8bf7a35c8f8e" exitCode=1
Apr 17 07:53:57.678066 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:57.677968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" event={"ID":"8edee96b-f66b-4c8b-af3b-de25372ba529","Type":"ContainerDied","Data":"7bb5478d1a265ec9b398900bab2834bb0b579e4b938b2c23702e8bf7a35c8f8e"}
Apr 17 07:53:57.678293 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:57.678274 2573 scope.go:117] "RemoveContainer" containerID="7bb5478d1a265ec9b398900bab2834bb0b579e4b938b2c23702e8bf7a35c8f8e"
Apr 17 07:53:58.032889 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.032862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp"
Apr 17 07:53:58.033045 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.032896 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk"
Apr 17 07:53:58.033045 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.032948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:53:58.035343 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.035325 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcb3e19a-a695-43e0-bfdc-31eb223b9c0c-metrics-tls\") pod \"dns-default-sxlhk\" (UID: \"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c\") " pod="openshift-dns/dns-default-sxlhk"
Apr 17 07:53:58.035505 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.035488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7f44b6-f8a9-4abf-b749-3dc63b29e396-cert\") pod \"ingress-canary-pw7cp\" (UID: \"ce7f44b6-f8a9-4abf-b749-3dc63b29e396\") " pod="openshift-ingress-canary/ingress-canary-pw7cp"
Apr 17 07:53:58.035610 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.035589 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"image-registry-5cd847b7b9-j4hg8\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:53:58.167559 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.167534 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6xw8q\""
Apr 17 07:53:58.167675 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.167657 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9vrx5\""
Apr 17 07:53:58.175617 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.175593 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pw7cp"
Apr 17 07:53:58.175663 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.175622 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:53:58.300923 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.300861 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pw7cp"]
Apr 17 07:53:58.303750 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:53:58.303721 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce7f44b6_f8a9_4abf_b749_3dc63b29e396.slice/crio-57217037af874c98e5a27e00e1a60f69e5896e12743968b39d58ff7b1d51b23a WatchSource:0}: Error finding container 57217037af874c98e5a27e00e1a60f69e5896e12743968b39d58ff7b1d51b23a: Status 404 returned error can't find the container with id 57217037af874c98e5a27e00e1a60f69e5896e12743968b39d58ff7b1d51b23a
Apr 17 07:53:58.320994 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.320976 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cd847b7b9-j4hg8"]
Apr 17 07:53:58.322901 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:53:58.322876 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b907fba_a1e3_48d5_ac78_6b5209129b2b.slice/crio-cae06905a530035a503776857a793243c2b5b9416ccec27da6043c008106c529 WatchSource:0}: Error finding container cae06905a530035a503776857a793243c2b5b9416ccec27da6043c008106c529: Status 404 returned error can't find the container with id cae06905a530035a503776857a793243c2b5b9416ccec27da6043c008106c529
Apr 17 07:53:58.682431 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.682373 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg" event={"ID":"8edee96b-f66b-4c8b-af3b-de25372ba529","Type":"ContainerStarted","Data":"a3c0427c3b517963acffc1280aebe67c78c660349b468f09e1d0e4a51a7bd013"}
Apr 17 07:53:58.682849 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.682715 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg"
Apr 17 07:53:58.683455 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.683430 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d8c787ff6-xwqxg"
Apr 17 07:53:58.683567 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.683548 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pw7cp" event={"ID":"ce7f44b6-f8a9-4abf-b749-3dc63b29e396","Type":"ContainerStarted","Data":"57217037af874c98e5a27e00e1a60f69e5896e12743968b39d58ff7b1d51b23a"}
Apr 17 07:53:58.685301 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.685281 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nznm9" event={"ID":"3bf17a6b-1e32-46a8-b120-2174e7e517b3","Type":"ContainerStarted","Data":"0c07462fcdce1924c8c3c407f0983fe0d5889ac9f2dcd154da22947e502ba68f"}
Apr 17 07:53:58.685410 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.685305 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nznm9" event={"ID":"3bf17a6b-1e32-46a8-b120-2174e7e517b3","Type":"ContainerStarted","Data":"6916602c49b2a2460ac4aa1833c0f88bf98b366eec6d615eee2b40ad8a53667e"}
Apr 17 07:53:58.686591 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.686570 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" event={"ID":"8b907fba-a1e3-48d5-ac78-6b5209129b2b","Type":"ContainerStarted","Data":"a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422"}
Apr 17 07:53:58.686680 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.686618 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" event={"ID":"8b907fba-a1e3-48d5-ac78-6b5209129b2b","Type":"ContainerStarted","Data":"cae06905a530035a503776857a793243c2b5b9416ccec27da6043c008106c529"}
Apr 17 07:53:58.686680 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.686667 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:53:58.688070 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.688054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f6555b97-p2pm7" event={"ID":"34a606cd-fd5c-4966-bcee-663ef861c625","Type":"ContainerStarted","Data":"8ee323baaf7508d5807a7784ada018f80d30e7b2a98674c15189f4a234eb71ef"}
Apr 17 07:53:58.745276 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.745233 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nznm9" podStartSLOduration=3.754415201 podStartE2EDuration="4.745219147s" podCreationTimestamp="2026-04-17 07:53:54 +0000 UTC" firstStartedPulling="2026-04-17 07:53:55.706309261 +0000 UTC m=+163.049560701" lastFinishedPulling="2026-04-17 07:53:56.697113192 +0000 UTC m=+164.040364647" observedRunningTime="2026-04-17 07:53:58.744682639 +0000 UTC m=+166.087934101" watchObservedRunningTime="2026-04-17 07:53:58.745219147 +0000 UTC m=+166.088470611"
Apr 17 07:53:58.745460 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:53:58.745439 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" podStartSLOduration=165.745434022 podStartE2EDuration="2m45.745434022s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:53:58.729047772 +0000 UTC m=+166.072299235" watchObservedRunningTime="2026-04-17 07:53:58.745434022 +0000 UTC m=+166.088685485"
Apr 17 07:54:00.695053 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:00.695020 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pw7cp" event={"ID":"ce7f44b6-f8a9-4abf-b749-3dc63b29e396","Type":"ContainerStarted","Data":"29968e4be79196ddd829e01a44ef839ac006a10f29b33233871c1d33c7efc65d"}
Apr 17 07:54:00.709218 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:00.709180 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pw7cp" podStartSLOduration=128.734489174 podStartE2EDuration="2m10.709165955s" podCreationTimestamp="2026-04-17 07:51:50 +0000 UTC" firstStartedPulling="2026-04-17 07:53:58.306000267 +0000 UTC m=+165.649251709" lastFinishedPulling="2026-04-17 07:54:00.280677049 +0000 UTC m=+167.623928490" observedRunningTime="2026-04-17 07:54:00.708932719 +0000 UTC m=+168.052184186" watchObservedRunningTime="2026-04-17 07:54:00.709165955 +0000 UTC m=+168.052417418"
Apr 17 07:54:05.160019 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:05.159930 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sxlhk"
Apr 17 07:54:05.163060 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:05.163033 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mw88v\""
Apr 17 07:54:05.170589 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:05.170570 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sxlhk"
Apr 17 07:54:05.287840 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:05.287812 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sxlhk"]
Apr 17 07:54:05.291032 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:54:05.291008 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcb3e19a_a695_43e0_bfdc_31eb223b9c0c.slice/crio-b4662a476ebcc56e0947a664d2a024e1b6ec2e7059dd295b7585edba6560f765 WatchSource:0}: Error finding container b4662a476ebcc56e0947a664d2a024e1b6ec2e7059dd295b7585edba6560f765: Status 404 returned error can't find the container with id b4662a476ebcc56e0947a664d2a024e1b6ec2e7059dd295b7585edba6560f765
Apr 17 07:54:05.710397 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:05.710343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sxlhk" event={"ID":"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c","Type":"ContainerStarted","Data":"b4662a476ebcc56e0947a664d2a024e1b6ec2e7059dd295b7585edba6560f765"}
Apr 17 07:54:06.714607 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:06.714537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sxlhk" event={"ID":"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c","Type":"ContainerStarted","Data":"6ae4b50c71f98ddac74eb42467681f78d6f210b02f2396821845caf33845e968"}
Apr 17 07:54:06.714607 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:06.714572 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sxlhk" event={"ID":"bcb3e19a-a695-43e0-bfdc-31eb223b9c0c","Type":"ContainerStarted","Data":"52cb5da09117f6acec9b24e7b5d3e5aedb3220c0b696d84196ed00ae40ed3fb4"}
Apr 17 07:54:06.714924 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:06.714667 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-sxlhk"
Apr 17 07:54:06.732284 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:06.732239 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sxlhk" podStartSLOduration=135.586817112 podStartE2EDuration="2m16.732229086s" podCreationTimestamp="2026-04-17 07:51:50 +0000 UTC" firstStartedPulling="2026-04-17 07:54:05.292880115 +0000 UTC m=+172.636131559" lastFinishedPulling="2026-04-17 07:54:06.438292092 +0000 UTC m=+173.781543533" observedRunningTime="2026-04-17 07:54:06.731203712 +0000 UTC m=+174.074455174" watchObservedRunningTime="2026-04-17 07:54:06.732229086 +0000 UTC m=+174.075480526"
Apr 17 07:54:09.160140 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:09.160103 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45"
Apr 17 07:54:16.720035 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:16.720005 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sxlhk"
Apr 17 07:54:18.179663 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:18.179632 2573 patch_prober.go:28] interesting pod/image-registry-5cd847b7b9-j4hg8 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 07:54:18.180072 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:18.179687 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" podUID="8b907fba-a1e3-48d5-ac78-6b5209129b2b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 07:54:19.695293 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:19.695269 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:54:31.591585 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:31.591553 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cd847b7b9-j4hg8"]
Apr 17 07:54:56.610216 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:56.610140 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" podUID="8b907fba-a1e3-48d5-ac78-6b5209129b2b" containerName="registry" containerID="cri-o://a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422" gracePeriod=30
Apr 17 07:54:57.837693 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.837672 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:54:57.856024 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.855999 2573 generic.go:358] "Generic (PLEG): container finished" podID="8b907fba-a1e3-48d5-ac78-6b5209129b2b" containerID="a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422" exitCode=0
Apr 17 07:54:57.856140 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.856033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" event={"ID":"8b907fba-a1e3-48d5-ac78-6b5209129b2b","Type":"ContainerDied","Data":"a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422"}
Apr 17 07:54:57.856140 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.856052 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8" event={"ID":"8b907fba-a1e3-48d5-ac78-6b5209129b2b","Type":"ContainerDied","Data":"cae06905a530035a503776857a793243c2b5b9416ccec27da6043c008106c529"}
Apr 17 07:54:57.856140 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.856051 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cd847b7b9-j4hg8"
Apr 17 07:54:57.856140 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.856067 2573 scope.go:117] "RemoveContainer" containerID="a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422"
Apr 17 07:54:57.865436 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.865415 2573 scope.go:117] "RemoveContainer" containerID="a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422"
Apr 17 07:54:57.865734 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:54:57.865711 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422\": container with ID starting with a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422 not found: ID does not exist" containerID="a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422"
Apr 17 07:54:57.865841 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.865744 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422"} err="failed to get container status \"a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422\": rpc error: code = NotFound desc = could not find container \"a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422\": container with ID starting with a4ea74eb066d70a5589257bdd40ebc0b1de1b79cad83135ba7b745ee334e9422 not found: ID does not exist"
Apr 17 07:54:57.963301 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963273 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk42j\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-kube-api-access-gk42j\") pod \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") "
Apr 17 07:54:57.963600
ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963307 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-trusted-ca\") pod \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " Apr 17 07:54:57.963600 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963325 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-certificates\") pod \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " Apr 17 07:54:57.963600 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963345 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b907fba-a1e3-48d5-ac78-6b5209129b2b-ca-trust-extracted\") pod \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " Apr 17 07:54:57.963600 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963535 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-installation-pull-secrets\") pod \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " Apr 17 07:54:57.963827 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963600 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-bound-sa-token\") pod \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " Apr 17 07:54:57.963827 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963637 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") pod \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " Apr 17 07:54:57.963827 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963682 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-image-registry-private-configuration\") pod \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\" (UID: \"8b907fba-a1e3-48d5-ac78-6b5209129b2b\") " Apr 17 07:54:57.963973 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963824 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8b907fba-a1e3-48d5-ac78-6b5209129b2b" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:57.963973 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963837 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8b907fba-a1e3-48d5-ac78-6b5209129b2b" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:57.963973 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963935 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-trusted-ca\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 07:54:57.963973 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.963957 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-certificates\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 07:54:57.966088 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.966041 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8b907fba-a1e3-48d5-ac78-6b5209129b2b" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:54:57.966207 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.966180 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "8b907fba-a1e3-48d5-ac78-6b5209129b2b" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:54:57.966269 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.966226 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-kube-api-access-gk42j" (OuterVolumeSpecName: "kube-api-access-gk42j") pod "8b907fba-a1e3-48d5-ac78-6b5209129b2b" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b"). InnerVolumeSpecName "kube-api-access-gk42j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:54:57.966327 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.966298 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8b907fba-a1e3-48d5-ac78-6b5209129b2b" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:54:57.966477 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.966453 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8b907fba-a1e3-48d5-ac78-6b5209129b2b" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:54:57.972274 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:57.972249 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b907fba-a1e3-48d5-ac78-6b5209129b2b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8b907fba-a1e3-48d5-ac78-6b5209129b2b" (UID: "8b907fba-a1e3-48d5-ac78-6b5209129b2b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:54:58.065100 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:58.065074 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-image-registry-private-configuration\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 07:54:58.065100 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:58.065096 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gk42j\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-kube-api-access-gk42j\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 07:54:58.065229 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:58.065106 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b907fba-a1e3-48d5-ac78-6b5209129b2b-ca-trust-extracted\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 07:54:58.065229 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:58.065115 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b907fba-a1e3-48d5-ac78-6b5209129b2b-installation-pull-secrets\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 07:54:58.065229 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:58.065124 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-bound-sa-token\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 07:54:58.065229 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:58.065133 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b907fba-a1e3-48d5-ac78-6b5209129b2b-registry-tls\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath 
\"\"" Apr 17 07:54:58.176255 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:58.176233 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cd847b7b9-j4hg8"] Apr 17 07:54:58.186472 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:58.186452 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5cd847b7b9-j4hg8"] Apr 17 07:54:59.164373 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:54:59.164338 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b907fba-a1e3-48d5-ac78-6b5209129b2b" path="/var/lib/kubelet/pods/8b907fba-a1e3-48d5-ac78-6b5209129b2b/volumes" Apr 17 07:55:24.951327 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:55:24.951289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:55:24.953987 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:55:24.953959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e501027-4496-4a12-a9d5-fc5c57942102-metrics-certs\") pod \"network-metrics-daemon-jvr45\" (UID: \"7e501027-4496-4a12-a9d5-fc5c57942102\") " pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:55:25.063767 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:55:25.063739 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjzzs\"" Apr 17 07:55:25.070909 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:55:25.070885 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jvr45" Apr 17 07:55:25.190119 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:55:25.190093 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jvr45"] Apr 17 07:55:25.192965 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:55:25.192935 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e501027_4496_4a12_a9d5_fc5c57942102.slice/crio-cbc4355faf2b806fde32fa471a09b9a19c9e431510deff80e014e23c825a1953 WatchSource:0}: Error finding container cbc4355faf2b806fde32fa471a09b9a19c9e431510deff80e014e23c825a1953: Status 404 returned error can't find the container with id cbc4355faf2b806fde32fa471a09b9a19c9e431510deff80e014e23c825a1953 Apr 17 07:55:25.934167 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:55:25.934133 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jvr45" event={"ID":"7e501027-4496-4a12-a9d5-fc5c57942102","Type":"ContainerStarted","Data":"cbc4355faf2b806fde32fa471a09b9a19c9e431510deff80e014e23c825a1953"} Apr 17 07:55:26.938610 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:55:26.938577 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jvr45" event={"ID":"7e501027-4496-4a12-a9d5-fc5c57942102","Type":"ContainerStarted","Data":"6a399d2933b0bd635a9fad5bde369483d56636f6fd89cc13d182be4d05bcf5c4"} Apr 17 07:55:26.938610 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:55:26.938615 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jvr45" event={"ID":"7e501027-4496-4a12-a9d5-fc5c57942102","Type":"ContainerStarted","Data":"f18b45994e9e56f7a13135173f006f7c6546c02b8e40c8614ed4ffafa0244a75"} Apr 17 07:55:26.959311 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:55:26.959256 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-jvr45" podStartSLOduration=252.959470878 podStartE2EDuration="4m13.95924481s" podCreationTimestamp="2026-04-17 07:51:13 +0000 UTC" firstStartedPulling="2026-04-17 07:55:25.194687038 +0000 UTC m=+252.537938479" lastFinishedPulling="2026-04-17 07:55:26.194460967 +0000 UTC m=+253.537712411" observedRunningTime="2026-04-17 07:55:26.954051382 +0000 UTC m=+254.297302901" watchObservedRunningTime="2026-04-17 07:55:26.95924481 +0000 UTC m=+254.302496273" Apr 17 07:56:13.059502 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:56:13.059474 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 07:57:39.791184 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.791149 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-558564fd68-c29wq"] Apr 17 07:57:39.791769 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.791484 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b907fba-a1e3-48d5-ac78-6b5209129b2b" containerName="registry" Apr 17 07:57:39.791769 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.791502 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b907fba-a1e3-48d5-ac78-6b5209129b2b" containerName="registry" Apr 17 07:57:39.791769 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.791569 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b907fba-a1e3-48d5-ac78-6b5209129b2b" containerName="registry" Apr 17 07:57:39.794311 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.794292 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-c29wq" Apr 17 07:57:39.795545 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.795524 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-phzzw"] Apr 17 07:57:39.797237 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.797216 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 17 07:57:39.797323 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.797261 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 07:57:39.797390 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.797356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 07:57:39.798036 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.798023 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-pwwf8\"" Apr 17 07:57:39.798204 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.798190 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" Apr 17 07:57:39.800238 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.800221 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 07:57:39.800311 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.800225 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-c7kc5\"" Apr 17 07:57:39.805944 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.805926 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-c29wq"] Apr 17 07:57:39.808906 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.808675 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-phzzw"] Apr 17 07:57:39.922554 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.922531 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mft57\" (UniqueName: \"kubernetes.io/projected/d18f0e34-2810-43cf-b2b3-e14387d086b9-kube-api-access-mft57\") pod \"kserve-controller-manager-558564fd68-c29wq\" (UID: \"d18f0e34-2810-43cf-b2b3-e14387d086b9\") " pod="kserve/kserve-controller-manager-558564fd68-c29wq" Apr 17 07:57:39.922663 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.922565 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d18f0e34-2810-43cf-b2b3-e14387d086b9-cert\") pod \"kserve-controller-manager-558564fd68-c29wq\" (UID: \"d18f0e34-2810-43cf-b2b3-e14387d086b9\") " pod="kserve/kserve-controller-manager-558564fd68-c29wq" Apr 17 07:57:39.922663 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.922592 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/8a25fee9-99e3-4187-b81d-e2f9802f42d2-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-phzzw\" (UID: \"8a25fee9-99e3-4187-b81d-e2f9802f42d2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" Apr 17 07:57:39.922663 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:39.922616 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78mrh\" (UniqueName: \"kubernetes.io/projected/8a25fee9-99e3-4187-b81d-e2f9802f42d2-kube-api-access-78mrh\") pod \"llmisvc-controller-manager-68cc5db7c4-phzzw\" (UID: \"8a25fee9-99e3-4187-b81d-e2f9802f42d2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" Apr 17 07:57:40.023527 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.023506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mft57\" (UniqueName: \"kubernetes.io/projected/d18f0e34-2810-43cf-b2b3-e14387d086b9-kube-api-access-mft57\") pod \"kserve-controller-manager-558564fd68-c29wq\" (UID: \"d18f0e34-2810-43cf-b2b3-e14387d086b9\") " pod="kserve/kserve-controller-manager-558564fd68-c29wq" Apr 17 07:57:40.023635 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.023544 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d18f0e34-2810-43cf-b2b3-e14387d086b9-cert\") pod \"kserve-controller-manager-558564fd68-c29wq\" (UID: \"d18f0e34-2810-43cf-b2b3-e14387d086b9\") " pod="kserve/kserve-controller-manager-558564fd68-c29wq" Apr 17 07:57:40.023635 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.023579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a25fee9-99e3-4187-b81d-e2f9802f42d2-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-phzzw\" (UID: \"8a25fee9-99e3-4187-b81d-e2f9802f42d2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" Apr 17 07:57:40.023635 
ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.023606 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78mrh\" (UniqueName: \"kubernetes.io/projected/8a25fee9-99e3-4187-b81d-e2f9802f42d2-kube-api-access-78mrh\") pod \"llmisvc-controller-manager-68cc5db7c4-phzzw\" (UID: \"8a25fee9-99e3-4187-b81d-e2f9802f42d2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" Apr 17 07:57:40.023747 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:57:40.023727 2573 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 17 07:57:40.023815 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:57:40.023804 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a25fee9-99e3-4187-b81d-e2f9802f42d2-cert podName:8a25fee9-99e3-4187-b81d-e2f9802f42d2 nodeName:}" failed. No retries permitted until 2026-04-17 07:57:40.523783187 +0000 UTC m=+387.867034630 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a25fee9-99e3-4187-b81d-e2f9802f42d2-cert") pod "llmisvc-controller-manager-68cc5db7c4-phzzw" (UID: "8a25fee9-99e3-4187-b81d-e2f9802f42d2") : secret "llmisvc-webhook-server-cert" not found Apr 17 07:57:40.026026 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.026009 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d18f0e34-2810-43cf-b2b3-e14387d086b9-cert\") pod \"kserve-controller-manager-558564fd68-c29wq\" (UID: \"d18f0e34-2810-43cf-b2b3-e14387d086b9\") " pod="kserve/kserve-controller-manager-558564fd68-c29wq" Apr 17 07:57:40.031991 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.031967 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mft57\" (UniqueName: \"kubernetes.io/projected/d18f0e34-2810-43cf-b2b3-e14387d086b9-kube-api-access-mft57\") pod \"kserve-controller-manager-558564fd68-c29wq\" (UID: \"d18f0e34-2810-43cf-b2b3-e14387d086b9\") " pod="kserve/kserve-controller-manager-558564fd68-c29wq" Apr 17 07:57:40.032312 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.032290 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78mrh\" (UniqueName: \"kubernetes.io/projected/8a25fee9-99e3-4187-b81d-e2f9802f42d2-kube-api-access-78mrh\") pod \"llmisvc-controller-manager-68cc5db7c4-phzzw\" (UID: \"8a25fee9-99e3-4187-b81d-e2f9802f42d2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" Apr 17 07:57:40.106087 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.106017 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-c29wq" Apr 17 07:57:40.218105 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.218074 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-c29wq"] Apr 17 07:57:40.220959 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:57:40.220932 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd18f0e34_2810_43cf_b2b3_e14387d086b9.slice/crio-a367ac59570df6b6523164571a6295a9c44df1add8f28b20a5688bf371bb0036 WatchSource:0}: Error finding container a367ac59570df6b6523164571a6295a9c44df1add8f28b20a5688bf371bb0036: Status 404 returned error can't find the container with id a367ac59570df6b6523164571a6295a9c44df1add8f28b20a5688bf371bb0036 Apr 17 07:57:40.222096 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.222079 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 07:57:40.291165 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.291135 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-c29wq" event={"ID":"d18f0e34-2810-43cf-b2b3-e14387d086b9","Type":"ContainerStarted","Data":"a367ac59570df6b6523164571a6295a9c44df1add8f28b20a5688bf371bb0036"} Apr 17 07:57:40.527268 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.527241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a25fee9-99e3-4187-b81d-e2f9802f42d2-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-phzzw\" (UID: \"8a25fee9-99e3-4187-b81d-e2f9802f42d2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" Apr 17 07:57:40.529720 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.529699 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/8a25fee9-99e3-4187-b81d-e2f9802f42d2-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-phzzw\" (UID: \"8a25fee9-99e3-4187-b81d-e2f9802f42d2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" Apr 17 07:57:40.712108 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.712060 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" Apr 17 07:57:40.850915 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:40.850862 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-phzzw"] Apr 17 07:57:40.854670 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:57:40.854637 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8a25fee9_99e3_4187_b81d_e2f9802f42d2.slice/crio-df965b8613c47cd47b502f80bf6167e9a82cad4e4607cc6143ae044a70c26212 WatchSource:0}: Error finding container df965b8613c47cd47b502f80bf6167e9a82cad4e4607cc6143ae044a70c26212: Status 404 returned error can't find the container with id df965b8613c47cd47b502f80bf6167e9a82cad4e4607cc6143ae044a70c26212 Apr 17 07:57:41.295011 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:41.294977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" event={"ID":"8a25fee9-99e3-4187-b81d-e2f9802f42d2","Type":"ContainerStarted","Data":"df965b8613c47cd47b502f80bf6167e9a82cad4e4607cc6143ae044a70c26212"} Apr 17 07:57:44.303483 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:44.303441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" event={"ID":"8a25fee9-99e3-4187-b81d-e2f9802f42d2","Type":"ContainerStarted","Data":"7cb1b9ed660b32f095d3c4192f36517a2a6b535835babc5205bac662b98378be"} Apr 17 07:57:44.303952 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:44.303652 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw"
Apr 17 07:57:44.304807 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:44.304778 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-c29wq" event={"ID":"d18f0e34-2810-43cf-b2b3-e14387d086b9","Type":"ContainerStarted","Data":"c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca"}
Apr 17 07:57:44.304916 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:44.304892 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-558564fd68-c29wq"
Apr 17 07:57:44.321760 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:44.321716 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw" podStartSLOduration=2.298498184 podStartE2EDuration="5.321701237s" podCreationTimestamp="2026-04-17 07:57:39 +0000 UTC" firstStartedPulling="2026-04-17 07:57:40.856297529 +0000 UTC m=+388.199548969" lastFinishedPulling="2026-04-17 07:57:43.87950058 +0000 UTC m=+391.222752022" observedRunningTime="2026-04-17 07:57:44.320881194 +0000 UTC m=+391.664132657" watchObservedRunningTime="2026-04-17 07:57:44.321701237 +0000 UTC m=+391.664952679"
Apr 17 07:57:44.336956 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:57:44.336913 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-558564fd68-c29wq" podStartSLOduration=2.45024295 podStartE2EDuration="5.336901364s" podCreationTimestamp="2026-04-17 07:57:39 +0000 UTC" firstStartedPulling="2026-04-17 07:57:40.222195634 +0000 UTC m=+387.565447075" lastFinishedPulling="2026-04-17 07:57:43.108854045 +0000 UTC m=+390.452105489" observedRunningTime="2026-04-17 07:57:44.335870358 +0000 UTC m=+391.679121822" watchObservedRunningTime="2026-04-17 07:57:44.336901364 +0000 UTC m=+391.680152909"
Apr 17 07:58:15.310362 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:15.310311 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-phzzw"
Apr 17 07:58:15.313748 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:15.313722 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-558564fd68-c29wq"
Apr 17 07:58:16.469430 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.469367 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-c29wq"]
Apr 17 07:58:16.469939 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.469687 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-558564fd68-c29wq" podUID="d18f0e34-2810-43cf-b2b3-e14387d086b9" containerName="manager" containerID="cri-o://c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca" gracePeriod=10
Apr 17 07:58:16.490762 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.490736 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-558564fd68-rwgw4"]
Apr 17 07:58:16.493907 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.493884 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-rwgw4"
Apr 17 07:58:16.500469 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.500444 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-rwgw4"]
Apr 17 07:58:16.662978 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.662948 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2b6b903-ecdb-4bd5-be31-5c280499ade5-cert\") pod \"kserve-controller-manager-558564fd68-rwgw4\" (UID: \"b2b6b903-ecdb-4bd5-be31-5c280499ade5\") " pod="kserve/kserve-controller-manager-558564fd68-rwgw4"
Apr 17 07:58:16.663082 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.662992 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vslfr\" (UniqueName: \"kubernetes.io/projected/b2b6b903-ecdb-4bd5-be31-5c280499ade5-kube-api-access-vslfr\") pod \"kserve-controller-manager-558564fd68-rwgw4\" (UID: \"b2b6b903-ecdb-4bd5-be31-5c280499ade5\") " pod="kserve/kserve-controller-manager-558564fd68-rwgw4"
Apr 17 07:58:16.706063 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.706043 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-c29wq"
Apr 17 07:58:16.764236 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.764175 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vslfr\" (UniqueName: \"kubernetes.io/projected/b2b6b903-ecdb-4bd5-be31-5c280499ade5-kube-api-access-vslfr\") pod \"kserve-controller-manager-558564fd68-rwgw4\" (UID: \"b2b6b903-ecdb-4bd5-be31-5c280499ade5\") " pod="kserve/kserve-controller-manager-558564fd68-rwgw4"
Apr 17 07:58:16.764337 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.764244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2b6b903-ecdb-4bd5-be31-5c280499ade5-cert\") pod \"kserve-controller-manager-558564fd68-rwgw4\" (UID: \"b2b6b903-ecdb-4bd5-be31-5c280499ade5\") " pod="kserve/kserve-controller-manager-558564fd68-rwgw4"
Apr 17 07:58:16.766686 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.766666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2b6b903-ecdb-4bd5-be31-5c280499ade5-cert\") pod \"kserve-controller-manager-558564fd68-rwgw4\" (UID: \"b2b6b903-ecdb-4bd5-be31-5c280499ade5\") " pod="kserve/kserve-controller-manager-558564fd68-rwgw4"
Apr 17 07:58:16.772048 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.772027 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vslfr\" (UniqueName: \"kubernetes.io/projected/b2b6b903-ecdb-4bd5-be31-5c280499ade5-kube-api-access-vslfr\") pod \"kserve-controller-manager-558564fd68-rwgw4\" (UID: \"b2b6b903-ecdb-4bd5-be31-5c280499ade5\") " pod="kserve/kserve-controller-manager-558564fd68-rwgw4"
Apr 17 07:58:16.856868 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.856842 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-rwgw4"
Apr 17 07:58:16.864628 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.864606 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d18f0e34-2810-43cf-b2b3-e14387d086b9-cert\") pod \"d18f0e34-2810-43cf-b2b3-e14387d086b9\" (UID: \"d18f0e34-2810-43cf-b2b3-e14387d086b9\") "
Apr 17 07:58:16.864710 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.864651 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mft57\" (UniqueName: \"kubernetes.io/projected/d18f0e34-2810-43cf-b2b3-e14387d086b9-kube-api-access-mft57\") pod \"d18f0e34-2810-43cf-b2b3-e14387d086b9\" (UID: \"d18f0e34-2810-43cf-b2b3-e14387d086b9\") "
Apr 17 07:58:16.866722 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.866691 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18f0e34-2810-43cf-b2b3-e14387d086b9-cert" (OuterVolumeSpecName: "cert") pod "d18f0e34-2810-43cf-b2b3-e14387d086b9" (UID: "d18f0e34-2810-43cf-b2b3-e14387d086b9"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:58:16.866889 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.866874 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18f0e34-2810-43cf-b2b3-e14387d086b9-kube-api-access-mft57" (OuterVolumeSpecName: "kube-api-access-mft57") pod "d18f0e34-2810-43cf-b2b3-e14387d086b9" (UID: "d18f0e34-2810-43cf-b2b3-e14387d086b9"). InnerVolumeSpecName "kube-api-access-mft57". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:58:16.965851 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.965815 2573 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d18f0e34-2810-43cf-b2b3-e14387d086b9-cert\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 07:58:16.965851 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.965848 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mft57\" (UniqueName: \"kubernetes.io/projected/d18f0e34-2810-43cf-b2b3-e14387d086b9-kube-api-access-mft57\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 07:58:16.967829 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:16.967805 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-rwgw4"]
Apr 17 07:58:16.970752 ip-10-0-138-143 kubenswrapper[2573]: W0417 07:58:16.970728 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2b6b903_ecdb_4bd5_be31_5c280499ade5.slice/crio-174d7aeb58b56cb70668dd2bb632879ea47e4a76e16f73d3ccab6e716788a8fa WatchSource:0}: Error finding container 174d7aeb58b56cb70668dd2bb632879ea47e4a76e16f73d3ccab6e716788a8fa: Status 404 returned error can't find the container with id 174d7aeb58b56cb70668dd2bb632879ea47e4a76e16f73d3ccab6e716788a8fa
Apr 17 07:58:17.390432 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:17.390403 2573 generic.go:358] "Generic (PLEG): container finished" podID="d18f0e34-2810-43cf-b2b3-e14387d086b9" containerID="c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca" exitCode=0
Apr 17 07:58:17.390541 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:17.390470 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-c29wq"
Apr 17 07:58:17.390541 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:17.390485 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-c29wq" event={"ID":"d18f0e34-2810-43cf-b2b3-e14387d086b9","Type":"ContainerDied","Data":"c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca"}
Apr 17 07:58:17.390541 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:17.390531 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-c29wq" event={"ID":"d18f0e34-2810-43cf-b2b3-e14387d086b9","Type":"ContainerDied","Data":"a367ac59570df6b6523164571a6295a9c44df1add8f28b20a5688bf371bb0036"}
Apr 17 07:58:17.390702 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:17.390550 2573 scope.go:117] "RemoveContainer" containerID="c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca"
Apr 17 07:58:17.391450 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:17.391429 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-rwgw4" event={"ID":"b2b6b903-ecdb-4bd5-be31-5c280499ade5","Type":"ContainerStarted","Data":"174d7aeb58b56cb70668dd2bb632879ea47e4a76e16f73d3ccab6e716788a8fa"}
Apr 17 07:58:17.403330 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:17.403315 2573 scope.go:117] "RemoveContainer" containerID="c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca"
Apr 17 07:58:17.403610 ip-10-0-138-143 kubenswrapper[2573]: E0417 07:58:17.403582 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca\": container with ID starting with c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca not found: ID does not exist" containerID="c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca"
Apr 17 07:58:17.403682 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:17.403618 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca"} err="failed to get container status \"c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca\": rpc error: code = NotFound desc = could not find container \"c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca\": container with ID starting with c85e1cc9536bd6c1074221a3a9ab83520f19d268c412a28c83ca31c6abe068ca not found: ID does not exist"
Apr 17 07:58:17.411806 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:17.411778 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-c29wq"]
Apr 17 07:58:17.415486 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:17.415469 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-c29wq"]
Apr 17 07:58:18.396477 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:18.396436 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-rwgw4" event={"ID":"b2b6b903-ecdb-4bd5-be31-5c280499ade5","Type":"ContainerStarted","Data":"a932ceeb572977b38322b9d4de86cc2b22b95d6ea511ba0c2b6ccf0c882c446e"}
Apr 17 07:58:18.396884 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:18.396582 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-558564fd68-rwgw4"
Apr 17 07:58:18.411214 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:18.411166 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-558564fd68-rwgw4" podStartSLOduration=2.037050666 podStartE2EDuration="2.411153007s" podCreationTimestamp="2026-04-17 07:58:16 +0000 UTC" firstStartedPulling="2026-04-17 07:58:16.972080254 +0000 UTC m=+424.315331695" lastFinishedPulling="2026-04-17 07:58:17.346182593 +0000 UTC m=+424.689434036" observedRunningTime="2026-04-17 07:58:18.41052353 +0000 UTC m=+425.753774991" watchObservedRunningTime="2026-04-17 07:58:18.411153007 +0000 UTC m=+425.754404531"
Apr 17 07:58:19.163864 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:19.163825 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18f0e34-2810-43cf-b2b3-e14387d086b9" path="/var/lib/kubelet/pods/d18f0e34-2810-43cf-b2b3-e14387d086b9/volumes"
Apr 17 07:58:49.404504 ip-10-0-138-143 kubenswrapper[2573]: I0417 07:58:49.404420 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-558564fd68-rwgw4"
Apr 17 08:00:17.988842 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:17.988763 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"]
Apr 17 08:00:17.989230 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:17.988987 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d18f0e34-2810-43cf-b2b3-e14387d086b9" containerName="manager"
Apr 17 08:00:17.989230 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:17.988997 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18f0e34-2810-43cf-b2b3-e14387d086b9" containerName="manager"
Apr 17 08:00:17.989230 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:17.989048 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d18f0e34-2810-43cf-b2b3-e14387d086b9" containerName="manager"
Apr 17 08:00:17.991934 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:17.991916 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"
Apr 17 08:00:17.994083 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:17.994065 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9pcqp\""
Apr 17 08:00:18.000676 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:18.000654 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"]
Apr 17 08:00:18.060944 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:18.060922 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e2e5212-2274-425a-b9bb-1a34d5e53317-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9\" (UID: \"1e2e5212-2274-425a-b9bb-1a34d5e53317\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"
Apr 17 08:00:18.161674 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:18.161645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e2e5212-2274-425a-b9bb-1a34d5e53317-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9\" (UID: \"1e2e5212-2274-425a-b9bb-1a34d5e53317\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"
Apr 17 08:00:18.161969 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:18.161953 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e2e5212-2274-425a-b9bb-1a34d5e53317-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9\" (UID: \"1e2e5212-2274-425a-b9bb-1a34d5e53317\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"
Apr 17 08:00:18.302708 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:18.302624 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"
Apr 17 08:00:18.415721 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:18.415680 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"]
Apr 17 08:00:18.419367 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:00:18.419340 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e2e5212_2274_425a_b9bb_1a34d5e53317.slice/crio-4b945754cd45d521ebd59726e01d39a94eee5b3eed9585d4ebdaf785d4fba886 WatchSource:0}: Error finding container 4b945754cd45d521ebd59726e01d39a94eee5b3eed9585d4ebdaf785d4fba886: Status 404 returned error can't find the container with id 4b945754cd45d521ebd59726e01d39a94eee5b3eed9585d4ebdaf785d4fba886
Apr 17 08:00:18.710514 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:18.710487 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" event={"ID":"1e2e5212-2274-425a-b9bb-1a34d5e53317","Type":"ContainerStarted","Data":"4b945754cd45d521ebd59726e01d39a94eee5b3eed9585d4ebdaf785d4fba886"}
Apr 17 08:00:23.727555 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:23.727519 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" event={"ID":"1e2e5212-2274-425a-b9bb-1a34d5e53317","Type":"ContainerStarted","Data":"1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12"}
Apr 17 08:00:26.736421 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:26.736364 2573 generic.go:358] "Generic (PLEG): container finished" podID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerID="1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12" exitCode=0
Apr 17 08:00:26.736754 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:26.736418 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" event={"ID":"1e2e5212-2274-425a-b9bb-1a34d5e53317","Type":"ContainerDied","Data":"1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12"}
Apr 17 08:00:40.781018 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:40.780977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" event={"ID":"1e2e5212-2274-425a-b9bb-1a34d5e53317","Type":"ContainerStarted","Data":"a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5"}
Apr 17 08:00:43.790539 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:43.790497 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" event={"ID":"1e2e5212-2274-425a-b9bb-1a34d5e53317","Type":"ContainerStarted","Data":"8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7"}
Apr 17 08:00:43.790892 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:43.790710 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"
Apr 17 08:00:43.791728 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:43.791704 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 17 08:00:43.808755 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:43.808706 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podStartSLOduration=2.134046228 podStartE2EDuration="26.808690876s" podCreationTimestamp="2026-04-17 08:00:17 +0000 UTC" firstStartedPulling="2026-04-17 08:00:18.421519193 +0000 UTC m=+545.764770647" lastFinishedPulling="2026-04-17 08:00:43.096163846 +0000 UTC m=+570.439415295" observedRunningTime="2026-04-17 08:00:43.807675647 +0000 UTC m=+571.150927103" watchObservedRunningTime="2026-04-17 08:00:43.808690876 +0000 UTC m=+571.151942341"
Apr 17 08:00:44.793772 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:44.793742 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"
Apr 17 08:00:44.794129 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:44.793842 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 17 08:00:44.794699 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:44.794671 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:00:45.796077 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:45.796032 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 17 08:00:45.796512 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:45.796408 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:00:55.796332 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:55.796267 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 17 08:00:55.796823 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:00:55.796702 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:01:05.795980 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:05.795931 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 17 08:01:05.796399 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:05.796322 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:01:15.796645 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:15.796587 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 17 08:01:15.797015 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:15.796991 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:01:25.796642 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:25.796596 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 17 08:01:25.797153 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:25.797018 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:01:35.796704 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:35.796602 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 17 08:01:35.797151 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:35.797130 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:01:45.797512 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:45.797474 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"
Apr 17 08:01:45.797939 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:45.797597 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"
Apr 17 08:01:53.204782 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.204748 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"]
Apr 17 08:01:53.206196 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.205018 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" containerID="cri-o://a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5" gracePeriod=30
Apr 17 08:01:53.206196 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.205118 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" containerID="cri-o://8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7" gracePeriod=30
Apr 17 08:01:53.308204 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.308173 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"]
Apr 17 08:01:53.310452 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.310430 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"
Apr 17 08:01:53.319962 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.319934 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"]
Apr 17 08:01:53.400651 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.400616 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8b6230a-e750-498a-a307-2a68a0fd7af2-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz\" (UID: \"c8b6230a-e750-498a-a307-2a68a0fd7af2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"
Apr 17 08:01:53.501824 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.501764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8b6230a-e750-498a-a307-2a68a0fd7af2-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz\" (UID: \"c8b6230a-e750-498a-a307-2a68a0fd7af2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"
Apr 17 08:01:53.502086 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.502069 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8b6230a-e750-498a-a307-2a68a0fd7af2-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz\" (UID: \"c8b6230a-e750-498a-a307-2a68a0fd7af2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"
Apr 17 08:01:53.621309 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.621278 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"
Apr 17 08:01:53.740363 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.740320 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"]
Apr 17 08:01:53.746084 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:01:53.746055 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b6230a_e750_498a_a307_2a68a0fd7af2.slice/crio-33472fc29951a2d0368e11721e338e1c0b608fbb2879e14732749ad672852baf WatchSource:0}: Error finding container 33472fc29951a2d0368e11721e338e1c0b608fbb2879e14732749ad672852baf: Status 404 returned error can't find the container with id 33472fc29951a2d0368e11721e338e1c0b608fbb2879e14732749ad672852baf
Apr 17 08:01:53.979136 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.979094 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" event={"ID":"c8b6230a-e750-498a-a307-2a68a0fd7af2","Type":"ContainerStarted","Data":"7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b"}
Apr 17 08:01:53.979136 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:53.979129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" event={"ID":"c8b6230a-e750-498a-a307-2a68a0fd7af2","Type":"ContainerStarted","Data":"33472fc29951a2d0368e11721e338e1c0b608fbb2879e14732749ad672852baf"}
Apr 17 08:01:55.796567 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:55.796523 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 17 08:01:55.796986 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:55.796770 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:01:56.988799 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:56.988770 2573 generic.go:358] "Generic (PLEG): container finished" podID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerID="a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5" exitCode=0
Apr 17 08:01:56.989104 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:56.988814 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" event={"ID":"1e2e5212-2274-425a-b9bb-1a34d5e53317","Type":"ContainerDied","Data":"a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5"}
Apr 17 08:01:57.993635 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:57.993597 2573 generic.go:358] "Generic (PLEG): container finished" podID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerID="7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b" exitCode=0
Apr 17 08:01:57.994041 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:57.993668 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" event={"ID":"c8b6230a-e750-498a-a307-2a68a0fd7af2","Type":"ContainerDied","Data":"7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b"}
Apr 17 08:01:58.999104 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:58.999071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" event={"ID":"c8b6230a-e750-498a-a307-2a68a0fd7af2","Type":"ContainerStarted","Data":"cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0"}
Apr 17 08:01:58.999104 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:58.999109 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" event={"ID":"c8b6230a-e750-498a-a307-2a68a0fd7af2","Type":"ContainerStarted","Data":"f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8"}
Apr 17 08:01:58.999526 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:58.999414 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"
Apr 17 08:01:59.000527 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:59.000495 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:5000: connect: connection refused"
Apr 17 08:01:59.015428 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:01:59.015391 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podStartSLOduration=6.015363807 podStartE2EDuration="6.015363807s" podCreationTimestamp="2026-04-17 08:01:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:01:59.014313885 +0000 UTC m=+646.357565348" watchObservedRunningTime="2026-04-17 08:01:59.015363807 +0000 UTC m=+646.358615271"
Apr 17 08:02:00.001984 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:00.001954 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"
Apr 17 08:02:00.002407 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:00.002069 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:5000: connect: connection refused"
Apr 17 08:02:00.003019 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:00.002994 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:02:01.005020 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:01.004971 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:5000: connect: connection refused"
Apr 17 08:02:01.005423 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:01.005315 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:02:05.796645 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:05.796599 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 17 08:02:05.797077 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:05.796948 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent"
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:02:11.005735 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:11.005684 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:5000: connect: connection refused" Apr 17 08:02:11.006191 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:11.006170 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:02:15.796976 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:15.796919 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 17 08:02:15.797445 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:15.797089 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" Apr 17 08:02:15.797445 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:15.797261 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:02:15.797445 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:15.797354 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" Apr 17 08:02:21.005309 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:21.005262 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:5000: connect: connection refused" Apr 17 08:02:21.005851 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:21.005825 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:02:23.356638 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:23.356616 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" Apr 17 08:02:23.499421 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:23.499330 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e2e5212-2274-425a-b9bb-1a34d5e53317-kserve-provision-location\") pod \"1e2e5212-2274-425a-b9bb-1a34d5e53317\" (UID: \"1e2e5212-2274-425a-b9bb-1a34d5e53317\") " Apr 17 08:02:23.499684 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:23.499654 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e2e5212-2274-425a-b9bb-1a34d5e53317-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1e2e5212-2274-425a-b9bb-1a34d5e53317" (UID: "1e2e5212-2274-425a-b9bb-1a34d5e53317"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:02:23.599914 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:23.599890 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e2e5212-2274-425a-b9bb-1a34d5e53317-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:02:24.065615 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.065583 2573 generic.go:358] "Generic (PLEG): container finished" podID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerID="8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7" exitCode=0 Apr 17 08:02:24.065814 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.065644 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" event={"ID":"1e2e5212-2274-425a-b9bb-1a34d5e53317","Type":"ContainerDied","Data":"8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7"} Apr 17 08:02:24.065814 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.065678 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" event={"ID":"1e2e5212-2274-425a-b9bb-1a34d5e53317","Type":"ContainerDied","Data":"4b945754cd45d521ebd59726e01d39a94eee5b3eed9585d4ebdaf785d4fba886"} Apr 17 08:02:24.065814 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.065683 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9" Apr 17 08:02:24.065814 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.065698 2573 scope.go:117] "RemoveContainer" containerID="8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7" Apr 17 08:02:24.074306 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.074287 2573 scope.go:117] "RemoveContainer" containerID="a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5" Apr 17 08:02:24.081264 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.081245 2573 scope.go:117] "RemoveContainer" containerID="1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12" Apr 17 08:02:24.085986 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.085964 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"] Apr 17 08:02:24.089443 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.089423 2573 scope.go:117] "RemoveContainer" containerID="8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7" Apr 17 08:02:24.089687 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:02:24.089668 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7\": container with ID starting with 8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7 not found: ID does not exist" containerID="8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7" Apr 17 08:02:24.089756 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.089693 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7"} err="failed to get container status \"8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7\": rpc error: code = NotFound desc = could not find container 
\"8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7\": container with ID starting with 8f81af3f942740a07758498bfa382c7befe1eab0264a39084548491aa12c28d7 not found: ID does not exist" Apr 17 08:02:24.089756 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.089710 2573 scope.go:117] "RemoveContainer" containerID="a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5" Apr 17 08:02:24.089912 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.089894 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-r6mt9"] Apr 17 08:02:24.089949 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:02:24.089897 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5\": container with ID starting with a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5 not found: ID does not exist" containerID="a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5" Apr 17 08:02:24.089983 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.089947 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5"} err="failed to get container status \"a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5\": rpc error: code = NotFound desc = could not find container \"a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5\": container with ID starting with a6b2cb41b04767976c3cf2ec8adf4595968b398d3f35be52465d3948322d73f5 not found: ID does not exist" Apr 17 08:02:24.089983 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.089963 2573 scope.go:117] "RemoveContainer" containerID="1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12" Apr 17 08:02:24.090212 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:02:24.090194 2573 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12\": container with ID starting with 1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12 not found: ID does not exist" containerID="1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12" Apr 17 08:02:24.090257 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:24.090217 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12"} err="failed to get container status \"1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12\": rpc error: code = NotFound desc = could not find container \"1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12\": container with ID starting with 1c857e71c7e86c4b349cb147a335577ed04a67be3fad14430abdc2eae2b7bb12 not found: ID does not exist" Apr 17 08:02:25.163450 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:25.163419 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" path="/var/lib/kubelet/pods/1e2e5212-2274-425a-b9bb-1a34d5e53317/volumes" Apr 17 08:02:31.005221 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:31.005176 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:5000: connect: connection refused" Apr 17 08:02:31.005785 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:31.005762 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 17 08:02:41.005579 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:41.005529 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:5000: connect: connection refused" Apr 17 08:02:41.005987 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:41.005960 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:02:51.005831 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:51.005777 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:5000: connect: connection refused" Apr 17 08:02:51.006365 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:02:51.006243 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:03:01.005607 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:01.005569 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" Apr 17 08:03:01.006150 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:01.005633 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" Apr 17 08:03:08.367782 
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:08.367689 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"] Apr 17 08:03:08.368272 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:08.368078 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" containerID="cri-o://f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8" gracePeriod=30 Apr 17 08:03:08.368272 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:08.368156 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent" containerID="cri-o://cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0" gracePeriod=30 Apr 17 08:03:11.005342 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:11.005288 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:5000: connect: connection refused" Apr 17 08:03:11.005807 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:11.005660 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:03:12.194394 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:12.194354 2573 generic.go:358] "Generic (PLEG): container finished" podID="c8b6230a-e750-498a-a307-2a68a0fd7af2" 
containerID="f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8" exitCode=0 Apr 17 08:03:12.194732 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:12.194410 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" event={"ID":"c8b6230a-e750-498a-a307-2a68a0fd7af2","Type":"ContainerDied","Data":"f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8"} Apr 17 08:03:18.449926 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.449892 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"] Apr 17 08:03:18.450283 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.450156 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" Apr 17 08:03:18.450283 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.450167 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" Apr 17 08:03:18.450283 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.450177 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="storage-initializer" Apr 17 08:03:18.450283 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.450183 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="storage-initializer" Apr 17 08:03:18.450283 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.450193 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" Apr 17 08:03:18.450283 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.450200 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" Apr 17 
08:03:18.450283 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.450245 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="agent" Apr 17 08:03:18.450283 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.450253 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e2e5212-2274-425a-b9bb-1a34d5e53317" containerName="kserve-container" Apr 17 08:03:18.454370 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.454353 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" Apr 17 08:03:18.462480 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.462457 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"] Apr 17 08:03:18.557456 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.557432 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63f37bb9-3612-4466-8768-fd72e2ebc75e-kserve-provision-location\") pod \"isvc-logger-predictor-78cf4b7dc8-p5gc7\" (UID: \"63f37bb9-3612-4466-8768-fd72e2ebc75e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" Apr 17 08:03:18.657893 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.657868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63f37bb9-3612-4466-8768-fd72e2ebc75e-kserve-provision-location\") pod \"isvc-logger-predictor-78cf4b7dc8-p5gc7\" (UID: \"63f37bb9-3612-4466-8768-fd72e2ebc75e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" Apr 17 08:03:18.658177 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.658162 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/63f37bb9-3612-4466-8768-fd72e2ebc75e-kserve-provision-location\") pod \"isvc-logger-predictor-78cf4b7dc8-p5gc7\" (UID: \"63f37bb9-3612-4466-8768-fd72e2ebc75e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" Apr 17 08:03:18.765535 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.765466 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" Apr 17 08:03:18.878482 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.878449 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"] Apr 17 08:03:18.881606 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:03:18.881569 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63f37bb9_3612_4466_8768_fd72e2ebc75e.slice/crio-6909b4579279a4e919e9ed28094ec89946c1ac4dd8263b6a1edc9e4fce84d816 WatchSource:0}: Error finding container 6909b4579279a4e919e9ed28094ec89946c1ac4dd8263b6a1edc9e4fce84d816: Status 404 returned error can't find the container with id 6909b4579279a4e919e9ed28094ec89946c1ac4dd8263b6a1edc9e4fce84d816 Apr 17 08:03:18.883433 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:18.883414 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:03:19.213498 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:19.213458 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" event={"ID":"63f37bb9-3612-4466-8768-fd72e2ebc75e","Type":"ContainerStarted","Data":"1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987"} Apr 17 08:03:19.213498 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:19.213491 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" 
event={"ID":"63f37bb9-3612-4466-8768-fd72e2ebc75e","Type":"ContainerStarted","Data":"6909b4579279a4e919e9ed28094ec89946c1ac4dd8263b6a1edc9e4fce84d816"} Apr 17 08:03:21.005323 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:21.005271 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:5000: connect: connection refused" Apr 17 08:03:21.005804 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:21.005604 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:03:23.225575 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:23.225538 2573 generic.go:358] "Generic (PLEG): container finished" podID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerID="1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987" exitCode=0 Apr 17 08:03:23.226013 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:23.225623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" event={"ID":"63f37bb9-3612-4466-8768-fd72e2ebc75e","Type":"ContainerDied","Data":"1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987"} Apr 17 08:03:24.230339 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:24.230300 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" event={"ID":"63f37bb9-3612-4466-8768-fd72e2ebc75e","Type":"ContainerStarted","Data":"19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80"} Apr 17 08:03:24.230339 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:24.230340 2573 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" event={"ID":"63f37bb9-3612-4466-8768-fd72e2ebc75e","Type":"ContainerStarted","Data":"b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c"} Apr 17 08:03:24.230828 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:24.230785 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" Apr 17 08:03:24.230828 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:24.230815 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" Apr 17 08:03:24.231971 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:24.231945 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 17 08:03:24.232576 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:24.232552 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:03:24.246737 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:24.246685 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podStartSLOduration=6.246673861 podStartE2EDuration="6.246673861s" podCreationTimestamp="2026-04-17 08:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:03:24.245548973 +0000 UTC m=+731.588800436" watchObservedRunningTime="2026-04-17 08:03:24.246673861 +0000 UTC m=+731.589925324" Apr 17 
08:03:25.233371 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:25.233336 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 17 08:03:25.233767 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:25.233664 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:03:31.005420 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:31.005341 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:5000: connect: connection refused"
Apr 17 08:03:31.005880 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:31.005572 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"
Apr 17 08:03:31.005880 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:31.005662 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:03:31.005880 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:31.005770 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"
Apr 17 08:03:35.233659 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:35.233615 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 17 08:03:35.234087 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:35.234029 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:03:38.506158 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:38.506136 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"
Apr 17 08:03:38.691098 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:38.691074 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8b6230a-e750-498a-a307-2a68a0fd7af2-kserve-provision-location\") pod \"c8b6230a-e750-498a-a307-2a68a0fd7af2\" (UID: \"c8b6230a-e750-498a-a307-2a68a0fd7af2\") "
Apr 17 08:03:38.691364 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:38.691340 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8b6230a-e750-498a-a307-2a68a0fd7af2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c8b6230a-e750-498a-a307-2a68a0fd7af2" (UID: "c8b6230a-e750-498a-a307-2a68a0fd7af2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:03:38.791948 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:38.791919 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8b6230a-e750-498a-a307-2a68a0fd7af2-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:03:39.273460 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.273433 2573 generic.go:358] "Generic (PLEG): container finished" podID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerID="cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0" exitCode=0
Apr 17 08:03:39.273614 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.273468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" event={"ID":"c8b6230a-e750-498a-a307-2a68a0fd7af2","Type":"ContainerDied","Data":"cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0"}
Apr 17 08:03:39.273614 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.273489 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz" event={"ID":"c8b6230a-e750-498a-a307-2a68a0fd7af2","Type":"ContainerDied","Data":"33472fc29951a2d0368e11721e338e1c0b608fbb2879e14732749ad672852baf"}
Apr 17 08:03:39.273614 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.273506 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"
Apr 17 08:03:39.273614 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.273508 2573 scope.go:117] "RemoveContainer" containerID="cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0"
Apr 17 08:03:39.281137 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.281120 2573 scope.go:117] "RemoveContainer" containerID="f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8"
Apr 17 08:03:39.287686 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.287670 2573 scope.go:117] "RemoveContainer" containerID="7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b"
Apr 17 08:03:39.289701 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.289684 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"]
Apr 17 08:03:39.294844 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.294826 2573 scope.go:117] "RemoveContainer" containerID="cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0"
Apr 17 08:03:39.294945 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.294904 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-8gpmz"]
Apr 17 08:03:39.295126 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:03:39.295108 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0\": container with ID starting with cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0 not found: ID does not exist" containerID="cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0"
Apr 17 08:03:39.295176 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.295134 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0"} err="failed to get container status \"cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0\": rpc error: code = NotFound desc = could not find container \"cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0\": container with ID starting with cd81a242fb92efce17c247c9c6065076ae7c3c2c41b3166ce45ecec7211a94b0 not found: ID does not exist"
Apr 17 08:03:39.295176 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.295151 2573 scope.go:117] "RemoveContainer" containerID="f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8"
Apr 17 08:03:39.295644 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:03:39.295373 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8\": container with ID starting with f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8 not found: ID does not exist" containerID="f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8"
Apr 17 08:03:39.295694 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.295651 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8"} err="failed to get container status \"f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8\": rpc error: code = NotFound desc = could not find container \"f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8\": container with ID starting with f2d30f703bc672c3a3679033584ee644f45e23a3549a403423d28dde776436d8 not found: ID does not exist"
Apr 17 08:03:39.295694 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.295667 2573 scope.go:117] "RemoveContainer" containerID="7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b"
Apr 17 08:03:39.295842 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:03:39.295827 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b\": container with ID starting with 7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b not found: ID does not exist" containerID="7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b"
Apr 17 08:03:39.295882 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:39.295845 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b"} err="failed to get container status \"7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b\": rpc error: code = NotFound desc = could not find container \"7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b\": container with ID starting with 7f6de558c72b97a47fe7f4176163768eeb6d4ccb26e624b262d7febf4c28e19b not found: ID does not exist"
Apr 17 08:03:41.163686 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:41.163651 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" path="/var/lib/kubelet/pods/c8b6230a-e750-498a-a307-2a68a0fd7af2/volumes"
Apr 17 08:03:45.233581 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:45.233532 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 17 08:03:45.234113 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:45.234086 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:03:55.233334 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:55.233286 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 17 08:03:55.233858 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:03:55.233703 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:04:05.233507 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:05.233463 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 17 08:04:05.233953 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:05.233815 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:04:15.233648 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:15.233604 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 17 08:04:15.234121 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:15.234096 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:04:25.234559 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:25.234519 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"
Apr 17 08:04:25.235044 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:25.234797 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"
Apr 17 08:04:33.676662 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.676573 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"]
Apr 17 08:04:33.677185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.676934 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container" containerID="cri-o://b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c" gracePeriod=30
Apr 17 08:04:33.677185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.676934 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent" containerID="cri-o://19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80" gracePeriod=30
Apr 17 08:04:33.714437 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.714412 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"]
Apr 17 08:04:33.714669 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.714656 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent"
Apr 17 08:04:33.714711 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.714672 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent"
Apr 17 08:04:33.714711 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.714686 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container"
Apr 17 08:04:33.714711 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.714692 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container"
Apr 17 08:04:33.714711 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.714698 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="storage-initializer"
Apr 17 08:04:33.714711 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.714704 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="storage-initializer"
Apr 17 08:04:33.714859 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.714747 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="agent"
Apr 17 08:04:33.714859 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.714758 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8b6230a-e750-498a-a307-2a68a0fd7af2" containerName="kserve-container"
Apr 17 08:04:33.717297 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.717279 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"
Apr 17 08:04:33.725549 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.725521 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"]
Apr 17 08:04:33.744467 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.744442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cea63c6-47ec-4fae-8408-15c9e827a89b-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-89s6v\" (UID: \"0cea63c6-47ec-4fae-8408-15c9e827a89b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"
Apr 17 08:04:33.845568 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.845537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cea63c6-47ec-4fae-8408-15c9e827a89b-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-89s6v\" (UID: \"0cea63c6-47ec-4fae-8408-15c9e827a89b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"
Apr 17 08:04:33.845922 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:33.845902 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cea63c6-47ec-4fae-8408-15c9e827a89b-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-89s6v\" (UID: \"0cea63c6-47ec-4fae-8408-15c9e827a89b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"
Apr 17 08:04:34.028043 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:34.028010 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"
Apr 17 08:04:34.141567 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:34.141534 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"]
Apr 17 08:04:34.143956 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:04:34.143928 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cea63c6_47ec_4fae_8408_15c9e827a89b.slice/crio-dd56301a6ec686380aa33c7cda2cb0ac33f222d44be7eeba45aecf5fad6e4604 WatchSource:0}: Error finding container dd56301a6ec686380aa33c7cda2cb0ac33f222d44be7eeba45aecf5fad6e4604: Status 404 returned error can't find the container with id dd56301a6ec686380aa33c7cda2cb0ac33f222d44be7eeba45aecf5fad6e4604
Apr 17 08:04:34.417541 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:34.417464 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" event={"ID":"0cea63c6-47ec-4fae-8408-15c9e827a89b","Type":"ContainerStarted","Data":"5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e"}
Apr 17 08:04:34.417541 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:34.417498 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" event={"ID":"0cea63c6-47ec-4fae-8408-15c9e827a89b","Type":"ContainerStarted","Data":"dd56301a6ec686380aa33c7cda2cb0ac33f222d44be7eeba45aecf5fad6e4604"}
Apr 17 08:04:35.233368 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:35.233325 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 17 08:04:35.233777 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:35.233705 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:04:37.426202 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:37.426172 2573 generic.go:358] "Generic (PLEG): container finished" podID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerID="b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c" exitCode=0
Apr 17 08:04:37.426507 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:37.426220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" event={"ID":"63f37bb9-3612-4466-8768-fd72e2ebc75e","Type":"ContainerDied","Data":"b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c"}
Apr 17 08:04:38.430406 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:38.430345 2573 generic.go:358] "Generic (PLEG): container finished" podID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerID="5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e" exitCode=0
Apr 17 08:04:38.430843 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:38.430409 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" event={"ID":"0cea63c6-47ec-4fae-8408-15c9e827a89b","Type":"ContainerDied","Data":"5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e"}
Apr 17 08:04:45.233319 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:45.233278 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 17 08:04:45.233791 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:45.233628 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:04:46.454918 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:46.454879 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" event={"ID":"0cea63c6-47ec-4fae-8408-15c9e827a89b","Type":"ContainerStarted","Data":"9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e"}
Apr 17 08:04:46.455412 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:46.455224 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"
Apr 17 08:04:46.456589 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:46.456560 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 08:04:46.470872 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:46.470829 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" podStartSLOduration=6.398420994 podStartE2EDuration="13.470815437s" podCreationTimestamp="2026-04-17 08:04:33 +0000 UTC" firstStartedPulling="2026-04-17 08:04:38.431756991 +0000 UTC m=+805.775008432" lastFinishedPulling="2026-04-17 08:04:45.50415143 +0000 UTC m=+812.847402875" observedRunningTime="2026-04-17 08:04:46.469475023 +0000 UTC m=+813.812726487" watchObservedRunningTime="2026-04-17 08:04:46.470815437 +0000 UTC m=+813.814066900"
Apr 17 08:04:47.457463 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:47.457427 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 08:04:55.234107 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:55.234063 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 17 08:04:55.234569 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:55.234190 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"
Apr 17 08:04:55.234569 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:55.234366 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:04:55.234569 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:55.234501 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"
Apr 17 08:04:57.457855 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:04:57.457815 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 08:05:03.861829 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:03.861806 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"
Apr 17 08:05:03.958026 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:03.957995 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63f37bb9-3612-4466-8768-fd72e2ebc75e-kserve-provision-location\") pod \"63f37bb9-3612-4466-8768-fd72e2ebc75e\" (UID: \"63f37bb9-3612-4466-8768-fd72e2ebc75e\") "
Apr 17 08:05:03.958303 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:03.958281 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f37bb9-3612-4466-8768-fd72e2ebc75e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "63f37bb9-3612-4466-8768-fd72e2ebc75e" (UID: "63f37bb9-3612-4466-8768-fd72e2ebc75e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:05:04.059033 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.059002 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63f37bb9-3612-4466-8768-fd72e2ebc75e-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:05:04.500772 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.500727 2573 generic.go:358] "Generic (PLEG): container finished" podID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerID="19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80" exitCode=137
Apr 17 08:05:04.500943 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.500813 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"
Apr 17 08:05:04.500943 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.500806 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" event={"ID":"63f37bb9-3612-4466-8768-fd72e2ebc75e","Type":"ContainerDied","Data":"19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80"}
Apr 17 08:05:04.500943 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.500915 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7" event={"ID":"63f37bb9-3612-4466-8768-fd72e2ebc75e","Type":"ContainerDied","Data":"6909b4579279a4e919e9ed28094ec89946c1ac4dd8263b6a1edc9e4fce84d816"}
Apr 17 08:05:04.500943 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.500934 2573 scope.go:117] "RemoveContainer" containerID="19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80"
Apr 17 08:05:04.508831 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.508812 2573 scope.go:117] "RemoveContainer" containerID="b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c"
Apr 17 08:05:04.515614 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.515594 2573 scope.go:117] "RemoveContainer" containerID="1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987"
Apr 17 08:05:04.519246 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.519223 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"]
Apr 17 08:05:04.521715 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.521693 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-p5gc7"]
Apr 17 08:05:04.523648 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.523633 2573 scope.go:117] "RemoveContainer" containerID="19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80"
Apr 17 08:05:04.523888 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:05:04.523868 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80\": container with ID starting with 19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80 not found: ID does not exist" containerID="19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80"
Apr 17 08:05:04.523956 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.523901 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80"} err="failed to get container status \"19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80\": rpc error: code = NotFound desc = could not find container \"19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80\": container with ID starting with 19fa02f0af2b6d1a801f885e2509e9ed1d8123fb365ed7b917bf302856f32b80 not found: ID does not exist"
Apr 17 08:05:04.523956 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.523925 2573 scope.go:117] "RemoveContainer" containerID="b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c"
Apr 17 08:05:04.524131 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:05:04.524112 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c\": container with ID starting with b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c not found: ID does not exist" containerID="b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c"
Apr 17 08:05:04.524171 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.524139 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c"} err="failed to get container status \"b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c\": rpc error: code = NotFound desc = could not find container \"b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c\": container with ID starting with b7fdd792a180101e98c0405d70fe53fac4cdb007aa787502223f7f67326b259c not found: ID does not exist"
Apr 17 08:05:04.524171 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.524156 2573 scope.go:117] "RemoveContainer" containerID="1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987"
Apr 17 08:05:04.524339 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:05:04.524324 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987\": container with ID starting with 1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987 not found: ID does not exist" containerID="1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987"
Apr 17 08:05:04.524410 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:04.524342 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987"} err="failed to get container status \"1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987\": rpc error: code = NotFound desc = could not find container \"1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987\": container with ID starting with 1e88d3de42f0c3c9ba6cb0cae6060b7ca4e1f753993079ecf2330d8934f60987 not found: ID does not exist"
Apr 17 08:05:05.163424 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:05.163368 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" path="/var/lib/kubelet/pods/63f37bb9-3612-4466-8768-fd72e2ebc75e/volumes"
Apr 17 08:05:07.458221 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:07.458176 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 08:05:17.458231 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:17.458186 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 08:05:27.457949 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:27.457908 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 08:05:37.457835 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:37.457789 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 08:05:47.458081 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:47.458038 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 08:05:55.163771 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:05:55.163737 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"
Apr 17 08:06:03.830213 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.830114 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"]
Apr 17 08:06:03.830676 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.830410 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" containerID="cri-o://9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e" gracePeriod=30
Apr 17 08:06:03.963659 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.963630 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99"]
Apr 17 08:06:03.963901 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.963889 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent"
Apr 17 08:06:03.963957 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.963903 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent"
Apr 17 08:06:03.963957 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.963913 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container"
Apr 17 08:06:03.963957 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.963919 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container"
Apr 17 08:06:03.963957 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.963929 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="storage-initializer"
Apr 17 08:06:03.963957 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.963935 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="storage-initializer"
Apr 17 08:06:03.964128 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.963977 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="agent"
Apr 17 08:06:03.964128 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.963984 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="63f37bb9-3612-4466-8768-fd72e2ebc75e" containerName="kserve-container"
Apr 17 08:06:03.965734 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.965719 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99"
Apr 17 08:06:03.974649 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:03.974628 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99"]
Apr 17 08:06:04.157785 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:04.157716 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7351a26-0f4d-426c-956e-90509c21c498-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-hmr99\" (UID: \"a7351a26-0f4d-426c-956e-90509c21c498\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99"
Apr 17 08:06:04.258051 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:04.258014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7351a26-0f4d-426c-956e-90509c21c498-kserve-provision-location\") pod 
\"isvc-lightgbm-runtime-predictor-64984c7cb-hmr99\" (UID: \"a7351a26-0f4d-426c-956e-90509c21c498\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" Apr 17 08:06:04.258479 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:04.258459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7351a26-0f4d-426c-956e-90509c21c498-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-hmr99\" (UID: \"a7351a26-0f4d-426c-956e-90509c21c498\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" Apr 17 08:06:04.275131 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:04.275106 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" Apr 17 08:06:04.389491 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:04.389462 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99"] Apr 17 08:06:04.393600 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:06:04.393572 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7351a26_0f4d_426c_956e_90509c21c498.slice/crio-1cb0313931b79c6caeb148e07f8e8423d81b2b5c3009dd772e17166acc9d0868 WatchSource:0}: Error finding container 1cb0313931b79c6caeb148e07f8e8423d81b2b5c3009dd772e17166acc9d0868: Status 404 returned error can't find the container with id 1cb0313931b79c6caeb148e07f8e8423d81b2b5c3009dd772e17166acc9d0868 Apr 17 08:06:04.658780 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:04.658742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" 
event={"ID":"a7351a26-0f4d-426c-956e-90509c21c498","Type":"ContainerStarted","Data":"20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431"} Apr 17 08:06:04.658780 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:04.658784 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" event={"ID":"a7351a26-0f4d-426c-956e-90509c21c498","Type":"ContainerStarted","Data":"1cb0313931b79c6caeb148e07f8e8423d81b2b5c3009dd772e17166acc9d0868"} Apr 17 08:06:05.160864 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:05.160822 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 17 08:06:08.357645 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.357626 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" Apr 17 08:06:08.484401 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.484325 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cea63c6-47ec-4fae-8408-15c9e827a89b-kserve-provision-location\") pod \"0cea63c6-47ec-4fae-8408-15c9e827a89b\" (UID: \"0cea63c6-47ec-4fae-8408-15c9e827a89b\") " Apr 17 08:06:08.484650 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.484626 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cea63c6-47ec-4fae-8408-15c9e827a89b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0cea63c6-47ec-4fae-8408-15c9e827a89b" (UID: "0cea63c6-47ec-4fae-8408-15c9e827a89b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:08.585390 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.585361 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cea63c6-47ec-4fae-8408-15c9e827a89b-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:06:08.671016 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.670991 2573 generic.go:358] "Generic (PLEG): container finished" podID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerID="9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e" exitCode=0 Apr 17 08:06:08.671117 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.671066 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" Apr 17 08:06:08.671117 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.671086 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" event={"ID":"0cea63c6-47ec-4fae-8408-15c9e827a89b","Type":"ContainerDied","Data":"9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e"} Apr 17 08:06:08.671225 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.671129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v" event={"ID":"0cea63c6-47ec-4fae-8408-15c9e827a89b","Type":"ContainerDied","Data":"dd56301a6ec686380aa33c7cda2cb0ac33f222d44be7eeba45aecf5fad6e4604"} Apr 17 08:06:08.671225 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.671151 2573 scope.go:117] "RemoveContainer" containerID="9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e" Apr 17 08:06:08.672331 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.672309 2573 generic.go:358] "Generic (PLEG): container finished" podID="a7351a26-0f4d-426c-956e-90509c21c498" 
containerID="20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431" exitCode=0 Apr 17 08:06:08.672443 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.672352 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" event={"ID":"a7351a26-0f4d-426c-956e-90509c21c498","Type":"ContainerDied","Data":"20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431"} Apr 17 08:06:08.679306 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.679285 2573 scope.go:117] "RemoveContainer" containerID="5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e" Apr 17 08:06:08.685839 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.685753 2573 scope.go:117] "RemoveContainer" containerID="9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e" Apr 17 08:06:08.686036 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:06:08.686012 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e\": container with ID starting with 9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e not found: ID does not exist" containerID="9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e" Apr 17 08:06:08.686098 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.686043 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e"} err="failed to get container status \"9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e\": rpc error: code = NotFound desc = could not find container \"9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e\": container with ID starting with 9b094f76c0c32bc6de393951dea0f833bd7fdaa2c6ef354d8af43f95065eb88e not found: ID does not exist" Apr 17 08:06:08.686098 ip-10-0-138-143 
kubenswrapper[2573]: I0417 08:06:08.686060 2573 scope.go:117] "RemoveContainer" containerID="5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e" Apr 17 08:06:08.686302 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:06:08.686282 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e\": container with ID starting with 5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e not found: ID does not exist" containerID="5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e" Apr 17 08:06:08.686371 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.686318 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e"} err="failed to get container status \"5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e\": rpc error: code = NotFound desc = could not find container \"5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e\": container with ID starting with 5392739ca6353e5ce54e94db71da437373e89210737b62f6703578226ad25c2e not found: ID does not exist" Apr 17 08:06:08.706800 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.706779 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"] Apr 17 08:06:08.710145 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:08.710117 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-89s6v"] Apr 17 08:06:09.163506 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:09.163476 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" path="/var/lib/kubelet/pods/0cea63c6-47ec-4fae-8408-15c9e827a89b/volumes" Apr 17 08:06:09.677197 ip-10-0-138-143 kubenswrapper[2573]: I0417 
08:06:09.677160 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" event={"ID":"a7351a26-0f4d-426c-956e-90509c21c498","Type":"ContainerStarted","Data":"fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867"} Apr 17 08:06:09.677586 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:09.677487 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" Apr 17 08:06:09.678530 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:09.678505 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 08:06:09.692900 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:09.692862 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" podStartSLOduration=6.692849998 podStartE2EDuration="6.692849998s" podCreationTimestamp="2026-04-17 08:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:06:09.691345314 +0000 UTC m=+897.034596776" watchObservedRunningTime="2026-04-17 08:06:09.692849998 +0000 UTC m=+897.036101460" Apr 17 08:06:10.681071 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:10.681030 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 08:06:20.681204 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:20.681160 2573 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 08:06:30.682089 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:30.682048 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 08:06:40.682077 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:40.682029 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 08:06:50.681637 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:06:50.681579 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 08:07:00.681048 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:00.681009 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 08:07:10.681531 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:10.681490 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" 
podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 08:07:20.682556 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:20.682522 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" Apr 17 08:07:24.527910 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.527876 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99"] Apr 17 08:07:24.528537 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.528248 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" containerID="cri-o://fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867" gracePeriod=30 Apr 17 08:07:24.588408 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.588357 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm"] Apr 17 08:07:24.588696 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.588678 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" Apr 17 08:07:24.588771 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.588698 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" Apr 17 08:07:24.588771 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.588728 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="storage-initializer" Apr 17 08:07:24.588771 ip-10-0-138-143 kubenswrapper[2573]: I0417 
08:07:24.588737 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="storage-initializer" Apr 17 08:07:24.588998 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.588846 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cea63c6-47ec-4fae-8408-15c9e827a89b" containerName="kserve-container" Apr 17 08:07:24.591744 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.591725 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" Apr 17 08:07:24.599144 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.599122 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm"] Apr 17 08:07:24.686575 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.686552 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5657a9b-ec66-4c97-9885-f540e67adb3a-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm\" (UID: \"b5657a9b-ec66-4c97-9885-f540e67adb3a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" Apr 17 08:07:24.787478 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.787416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5657a9b-ec66-4c97-9885-f540e67adb3a-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm\" (UID: \"b5657a9b-ec66-4c97-9885-f540e67adb3a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" Apr 17 08:07:24.787755 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.787737 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/b5657a9b-ec66-4c97-9885-f540e67adb3a-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm\" (UID: \"b5657a9b-ec66-4c97-9885-f540e67adb3a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" Apr 17 08:07:24.902289 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:24.902270 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" Apr 17 08:07:25.016908 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:25.016792 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm"] Apr 17 08:07:25.019507 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:07:25.019476 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5657a9b_ec66_4c97_9885_f540e67adb3a.slice/crio-1e98d4a6f5c9eeb25ef74ca9f7b42d04cc5f8813855e442c9b6d34c668e5fd2c WatchSource:0}: Error finding container 1e98d4a6f5c9eeb25ef74ca9f7b42d04cc5f8813855e442c9b6d34c668e5fd2c: Status 404 returned error can't find the container with id 1e98d4a6f5c9eeb25ef74ca9f7b42d04cc5f8813855e442c9b6d34c668e5fd2c Apr 17 08:07:25.884004 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:25.883966 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" event={"ID":"b5657a9b-ec66-4c97-9885-f540e67adb3a","Type":"ContainerStarted","Data":"801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195"} Apr 17 08:07:25.884004 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:25.884007 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" 
event={"ID":"b5657a9b-ec66-4c97-9885-f540e67adb3a","Type":"ContainerStarted","Data":"1e98d4a6f5c9eeb25ef74ca9f7b42d04cc5f8813855e442c9b6d34c668e5fd2c"} Apr 17 08:07:29.057014 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.056994 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" Apr 17 08:07:29.118348 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.118318 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7351a26-0f4d-426c-956e-90509c21c498-kserve-provision-location\") pod \"a7351a26-0f4d-426c-956e-90509c21c498\" (UID: \"a7351a26-0f4d-426c-956e-90509c21c498\") " Apr 17 08:07:29.118641 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.118619 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7351a26-0f4d-426c-956e-90509c21c498-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a7351a26-0f4d-426c-956e-90509c21c498" (UID: "a7351a26-0f4d-426c-956e-90509c21c498"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:07:29.219608 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.219584 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7351a26-0f4d-426c-956e-90509c21c498-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:07:29.894948 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.894914 2573 generic.go:358] "Generic (PLEG): container finished" podID="a7351a26-0f4d-426c-956e-90509c21c498" containerID="fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867" exitCode=0 Apr 17 08:07:29.895147 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.895053 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" event={"ID":"a7351a26-0f4d-426c-956e-90509c21c498","Type":"ContainerDied","Data":"fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867"} Apr 17 08:07:29.895147 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.895072 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" Apr 17 08:07:29.895147 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.895091 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99" event={"ID":"a7351a26-0f4d-426c-956e-90509c21c498","Type":"ContainerDied","Data":"1cb0313931b79c6caeb148e07f8e8423d81b2b5c3009dd772e17166acc9d0868"} Apr 17 08:07:29.895147 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.895111 2573 scope.go:117] "RemoveContainer" containerID="fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867" Apr 17 08:07:29.896644 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.896622 2573 generic.go:358] "Generic (PLEG): container finished" podID="b5657a9b-ec66-4c97-9885-f540e67adb3a" containerID="801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195" exitCode=0 Apr 17 08:07:29.896765 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.896663 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" event={"ID":"b5657a9b-ec66-4c97-9885-f540e67adb3a","Type":"ContainerDied","Data":"801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195"} Apr 17 08:07:29.902665 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.902648 2573 scope.go:117] "RemoveContainer" containerID="20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431" Apr 17 08:07:29.909367 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.909351 2573 scope.go:117] "RemoveContainer" containerID="fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867" Apr 17 08:07:29.909621 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:07:29.909602 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867\": container with ID starting with 
fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867 not found: ID does not exist" containerID="fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867" Apr 17 08:07:29.909679 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.909627 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867"} err="failed to get container status \"fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867\": rpc error: code = NotFound desc = could not find container \"fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867\": container with ID starting with fc2515d8c7c50cfe6f3e3651228b5e1147d4ba8480db75edd58bb4d2bc11a867 not found: ID does not exist" Apr 17 08:07:29.909679 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.909643 2573 scope.go:117] "RemoveContainer" containerID="20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431" Apr 17 08:07:29.909840 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:07:29.909823 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431\": container with ID starting with 20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431 not found: ID does not exist" containerID="20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431" Apr 17 08:07:29.909880 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.909845 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431"} err="failed to get container status \"20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431\": rpc error: code = NotFound desc = could not find container \"20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431\": container with ID starting with 
20f131dcef2a7eca07a4f0b09474b0e0d63a1ab1b575f73ed17113f839795431 not found: ID does not exist" Apr 17 08:07:29.913084 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.913064 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99"] Apr 17 08:07:29.918069 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:29.918049 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hmr99"] Apr 17 08:07:31.165431 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:07:31.165036 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7351a26-0f4d-426c-956e-90509c21c498" path="/var/lib/kubelet/pods/a7351a26-0f4d-426c-956e-90509c21c498/volumes" Apr 17 08:09:48.285642 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:09:48.285594 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" event={"ID":"b5657a9b-ec66-4c97-9885-f540e67adb3a","Type":"ContainerStarted","Data":"847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1"} Apr 17 08:09:48.286054 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:09:48.285772 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" Apr 17 08:09:48.312332 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:09:48.309687 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" podStartSLOduration=6.910260263 podStartE2EDuration="2m24.309668037s" podCreationTimestamp="2026-04-17 08:07:24 +0000 UTC" firstStartedPulling="2026-04-17 08:07:29.897754406 +0000 UTC m=+977.241005847" lastFinishedPulling="2026-04-17 08:09:47.29716218 +0000 UTC m=+1114.640413621" observedRunningTime="2026-04-17 08:09:48.308067181 +0000 UTC m=+1115.651318645" 
watchObservedRunningTime="2026-04-17 08:09:48.309668037 +0000 UTC m=+1115.652919500" Apr 17 08:10:19.293138 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:19.293104 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" Apr 17 08:10:24.797646 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:24.797615 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm"] Apr 17 08:10:24.798117 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:24.797859 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" podUID="b5657a9b-ec66-4c97-9885-f540e67adb3a" containerName="kserve-container" containerID="cri-o://847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1" gracePeriod=30 Apr 17 08:10:24.900300 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:24.900263 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv"] Apr 17 08:10:24.900543 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:24.900529 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="storage-initializer" Apr 17 08:10:24.900543 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:24.900544 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="storage-initializer" Apr 17 08:10:24.900664 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:24.900559 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" Apr 17 08:10:24.900664 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:24.900565 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" Apr 17 08:10:24.900664 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:24.900633 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7351a26-0f4d-426c-956e-90509c21c498" containerName="kserve-container" Apr 17 08:10:24.918296 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:24.918265 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv"] Apr 17 08:10:24.918461 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:24.918438 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" Apr 17 08:10:24.976917 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:24.976885 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beb53e6c-fc3d-4a59-af39-8aea975cd714-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv\" (UID: \"beb53e6c-fc3d-4a59-af39-8aea975cd714\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" Apr 17 08:10:25.077775 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:25.077689 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beb53e6c-fc3d-4a59-af39-8aea975cd714-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv\" (UID: \"beb53e6c-fc3d-4a59-af39-8aea975cd714\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" Apr 17 08:10:25.078060 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:25.078042 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/beb53e6c-fc3d-4a59-af39-8aea975cd714-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv\" (UID: \"beb53e6c-fc3d-4a59-af39-8aea975cd714\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" Apr 17 08:10:25.229615 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:25.229592 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" Apr 17 08:10:25.349869 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:25.349685 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv"] Apr 17 08:10:25.352681 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:10:25.352654 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeb53e6c_fc3d_4a59_af39_8aea975cd714.slice/crio-fc1d08debef419c4a4210ebf7571fb3aeb3e28fd10efb82ae84f6586aba7961f WatchSource:0}: Error finding container fc1d08debef419c4a4210ebf7571fb3aeb3e28fd10efb82ae84f6586aba7961f: Status 404 returned error can't find the container with id fc1d08debef419c4a4210ebf7571fb3aeb3e28fd10efb82ae84f6586aba7961f Apr 17 08:10:25.355294 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:25.355272 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:10:25.381932 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:25.381898 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" event={"ID":"beb53e6c-fc3d-4a59-af39-8aea975cd714","Type":"ContainerStarted","Data":"fc1d08debef419c4a4210ebf7571fb3aeb3e28fd10efb82ae84f6586aba7961f"} Apr 17 08:10:25.754705 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:25.754684 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" Apr 17 08:10:25.783361 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:25.783333 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5657a9b-ec66-4c97-9885-f540e67adb3a-kserve-provision-location\") pod \"b5657a9b-ec66-4c97-9885-f540e67adb3a\" (UID: \"b5657a9b-ec66-4c97-9885-f540e67adb3a\") " Apr 17 08:10:25.783664 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:25.783644 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5657a9b-ec66-4c97-9885-f540e67adb3a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5657a9b-ec66-4c97-9885-f540e67adb3a" (UID: "b5657a9b-ec66-4c97-9885-f540e67adb3a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:10:25.884018 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:25.883963 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5657a9b-ec66-4c97-9885-f540e67adb3a-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:10:26.386084 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.386049 2573 generic.go:358] "Generic (PLEG): container finished" podID="b5657a9b-ec66-4c97-9885-f540e67adb3a" containerID="847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1" exitCode=0 Apr 17 08:10:26.386238 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.386112 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" Apr 17 08:10:26.386238 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.386139 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" event={"ID":"b5657a9b-ec66-4c97-9885-f540e67adb3a","Type":"ContainerDied","Data":"847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1"} Apr 17 08:10:26.386238 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.386179 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm" event={"ID":"b5657a9b-ec66-4c97-9885-f540e67adb3a","Type":"ContainerDied","Data":"1e98d4a6f5c9eeb25ef74ca9f7b42d04cc5f8813855e442c9b6d34c668e5fd2c"} Apr 17 08:10:26.386238 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.386197 2573 scope.go:117] "RemoveContainer" containerID="847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1" Apr 17 08:10:26.387690 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.387628 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" event={"ID":"beb53e6c-fc3d-4a59-af39-8aea975cd714","Type":"ContainerStarted","Data":"d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0"} Apr 17 08:10:26.393906 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.393805 2573 scope.go:117] "RemoveContainer" containerID="801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195" Apr 17 08:10:26.400599 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.400583 2573 scope.go:117] "RemoveContainer" containerID="847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1" Apr 17 08:10:26.400838 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:10:26.400821 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1\": container with ID starting with 847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1 not found: ID does not exist" containerID="847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1" Apr 17 08:10:26.400882 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.400847 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1"} err="failed to get container status \"847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1\": rpc error: code = NotFound desc = could not find container \"847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1\": container with ID starting with 847518cff11fd0bb100dc199da37e42f4c743fa86f4fd22e5b9c9e7c1158ecc1 not found: ID does not exist" Apr 17 08:10:26.400882 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.400865 2573 scope.go:117] "RemoveContainer" containerID="801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195" Apr 17 08:10:26.401096 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:10:26.401081 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195\": container with ID starting with 801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195 not found: ID does not exist" containerID="801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195" Apr 17 08:10:26.401135 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.401102 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195"} err="failed to get container status \"801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195\": rpc error: code = NotFound desc = could not find container 
\"801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195\": container with ID starting with 801b56754b078d742a0fd1bb89ec0382b2760ddc918fd6183bb78e4c6f426195 not found: ID does not exist" Apr 17 08:10:26.416969 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.416944 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm"] Apr 17 08:10:26.420697 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:26.420676 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-vx4fm"] Apr 17 08:10:27.164518 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:27.164478 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5657a9b-ec66-4c97-9885-f540e67adb3a" path="/var/lib/kubelet/pods/b5657a9b-ec66-4c97-9885-f540e67adb3a/volumes" Apr 17 08:10:29.398085 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:29.398058 2573 generic.go:358] "Generic (PLEG): container finished" podID="beb53e6c-fc3d-4a59-af39-8aea975cd714" containerID="d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0" exitCode=0 Apr 17 08:10:29.398433 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:29.398107 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" event={"ID":"beb53e6c-fc3d-4a59-af39-8aea975cd714","Type":"ContainerDied","Data":"d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0"} Apr 17 08:10:30.401677 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:30.401643 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" event={"ID":"beb53e6c-fc3d-4a59-af39-8aea975cd714","Type":"ContainerStarted","Data":"f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898"} Apr 17 08:10:30.402157 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:30.401942 2573 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" Apr 17 08:10:30.403182 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:30.403153 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" podUID="beb53e6c-fc3d-4a59-af39-8aea975cd714" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 17 08:10:30.418184 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:30.418146 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" podStartSLOduration=6.418133895 podStartE2EDuration="6.418133895s" podCreationTimestamp="2026-04-17 08:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:10:30.416362131 +0000 UTC m=+1157.759613593" watchObservedRunningTime="2026-04-17 08:10:30.418133895 +0000 UTC m=+1157.761385381" Apr 17 08:10:31.404882 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:31.404845 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" podUID="beb53e6c-fc3d-4a59-af39-8aea975cd714" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 17 08:10:41.405610 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:41.405524 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" Apr 17 08:10:44.940067 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:44.940026 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv"] Apr 17 08:10:44.940476 ip-10-0-138-143 
kubenswrapper[2573]: I0417 08:10:44.940274 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" podUID="beb53e6c-fc3d-4a59-af39-8aea975cd714" containerName="kserve-container" containerID="cri-o://f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898" gracePeriod=30 Apr 17 08:10:45.012870 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.012844 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p"] Apr 17 08:10:45.013091 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.013079 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5657a9b-ec66-4c97-9885-f540e67adb3a" containerName="storage-initializer" Apr 17 08:10:45.013139 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.013093 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5657a9b-ec66-4c97-9885-f540e67adb3a" containerName="storage-initializer" Apr 17 08:10:45.013139 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.013109 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5657a9b-ec66-4c97-9885-f540e67adb3a" containerName="kserve-container" Apr 17 08:10:45.013139 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.013116 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5657a9b-ec66-4c97-9885-f540e67adb3a" containerName="kserve-container" Apr 17 08:10:45.013226 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.013158 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5657a9b-ec66-4c97-9885-f540e67adb3a" containerName="kserve-container" Apr 17 08:10:45.014822 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.014806 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" Apr 17 08:10:45.027057 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.027035 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p"] Apr 17 08:10:45.114918 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.114888 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d70b4cb3-0e19-479a-8be0-32a61c68909b-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p\" (UID: \"d70b4cb3-0e19-479a-8be0-32a61c68909b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" Apr 17 08:10:45.216263 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.216170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d70b4cb3-0e19-479a-8be0-32a61c68909b-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p\" (UID: \"d70b4cb3-0e19-479a-8be0-32a61c68909b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" Apr 17 08:10:45.216560 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.216539 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d70b4cb3-0e19-479a-8be0-32a61c68909b-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p\" (UID: \"d70b4cb3-0e19-479a-8be0-32a61c68909b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" Apr 17 08:10:45.323650 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.323616 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" Apr 17 08:10:45.445984 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.445956 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p"] Apr 17 08:10:45.497777 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:10:45.497695 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd70b4cb3_0e19_479a_8be0_32a61c68909b.slice/crio-83e483db6fa822c7d61ceaad3081818ff13ffcee5dce80fb9256fdec6c6d5015 WatchSource:0}: Error finding container 83e483db6fa822c7d61ceaad3081818ff13ffcee5dce80fb9256fdec6c6d5015: Status 404 returned error can't find the container with id 83e483db6fa822c7d61ceaad3081818ff13ffcee5dce80fb9256fdec6c6d5015 Apr 17 08:10:45.569073 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.569053 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" Apr 17 08:10:45.619088 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.619066 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beb53e6c-fc3d-4a59-af39-8aea975cd714-kserve-provision-location\") pod \"beb53e6c-fc3d-4a59-af39-8aea975cd714\" (UID: \"beb53e6c-fc3d-4a59-af39-8aea975cd714\") " Apr 17 08:10:45.619420 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.619369 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beb53e6c-fc3d-4a59-af39-8aea975cd714-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "beb53e6c-fc3d-4a59-af39-8aea975cd714" (UID: "beb53e6c-fc3d-4a59-af39-8aea975cd714"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:10:45.720133 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:45.720099 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beb53e6c-fc3d-4a59-af39-8aea975cd714-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:10:46.445646 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.445609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" event={"ID":"d70b4cb3-0e19-479a-8be0-32a61c68909b","Type":"ContainerStarted","Data":"c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895"} Apr 17 08:10:46.446027 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.445650 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" event={"ID":"d70b4cb3-0e19-479a-8be0-32a61c68909b","Type":"ContainerStarted","Data":"83e483db6fa822c7d61ceaad3081818ff13ffcee5dce80fb9256fdec6c6d5015"} Apr 17 08:10:46.447104 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.447081 2573 generic.go:358] "Generic (PLEG): container finished" podID="beb53e6c-fc3d-4a59-af39-8aea975cd714" containerID="f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898" exitCode=0 Apr 17 08:10:46.447165 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.447130 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" Apr 17 08:10:46.447165 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.447137 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" event={"ID":"beb53e6c-fc3d-4a59-af39-8aea975cd714","Type":"ContainerDied","Data":"f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898"} Apr 17 08:10:46.447165 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.447156 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv" event={"ID":"beb53e6c-fc3d-4a59-af39-8aea975cd714","Type":"ContainerDied","Data":"fc1d08debef419c4a4210ebf7571fb3aeb3e28fd10efb82ae84f6586aba7961f"} Apr 17 08:10:46.447282 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.447171 2573 scope.go:117] "RemoveContainer" containerID="f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898" Apr 17 08:10:46.455076 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.455061 2573 scope.go:117] "RemoveContainer" containerID="d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0" Apr 17 08:10:46.462467 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.462445 2573 scope.go:117] "RemoveContainer" containerID="f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898" Apr 17 08:10:46.462706 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:10:46.462684 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898\": container with ID starting with f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898 not found: ID does not exist" containerID="f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898" Apr 17 08:10:46.462787 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.462749 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898"} err="failed to get container status \"f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898\": rpc error: code = NotFound desc = could not find container \"f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898\": container with ID starting with f8269f4b542a1ea31391a77cb8c90377d5758fb3263af8f3c0146de6fdb63898 not found: ID does not exist" Apr 17 08:10:46.462858 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.462775 2573 scope.go:117] "RemoveContainer" containerID="d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0" Apr 17 08:10:46.463088 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:10:46.463067 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0\": container with ID starting with d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0 not found: ID does not exist" containerID="d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0" Apr 17 08:10:46.463168 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.463093 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0"} err="failed to get container status \"d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0\": rpc error: code = NotFound desc = could not find container \"d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0\": container with ID starting with d3ed0bfbccd7d9fdfea6b8de48b0132c96e7607835f3e4a7ad90c581ace467f0 not found: ID does not exist" Apr 17 08:10:46.472373 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.472349 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv"] Apr 17 08:10:46.478191 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:46.478172 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-2tzhv"] Apr 17 08:10:47.163947 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:47.163916 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beb53e6c-fc3d-4a59-af39-8aea975cd714" path="/var/lib/kubelet/pods/beb53e6c-fc3d-4a59-af39-8aea975cd714/volumes" Apr 17 08:10:50.461022 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:50.460987 2573 generic.go:358] "Generic (PLEG): container finished" podID="d70b4cb3-0e19-479a-8be0-32a61c68909b" containerID="c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895" exitCode=0 Apr 17 08:10:50.461373 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:50.461062 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" event={"ID":"d70b4cb3-0e19-479a-8be0-32a61c68909b","Type":"ContainerDied","Data":"c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895"} Apr 17 08:10:51.465078 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:51.465042 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" event={"ID":"d70b4cb3-0e19-479a-8be0-32a61c68909b","Type":"ContainerStarted","Data":"7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e"} Apr 17 08:10:51.465521 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:51.465264 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" Apr 17 08:10:51.480723 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:10:51.480685 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" podStartSLOduration=7.480670867 podStartE2EDuration="7.480670867s" podCreationTimestamp="2026-04-17 08:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:10:51.479029434 +0000 UTC m=+1178.822280896" watchObservedRunningTime="2026-04-17 08:10:51.480670867 +0000 UTC m=+1178.823922330" Apr 17 08:11:22.472700 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:22.472667 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" Apr 17 08:11:25.111687 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.111658 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p"] Apr 17 08:11:25.112057 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.111891 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" podUID="d70b4cb3-0e19-479a-8be0-32a61c68909b" containerName="kserve-container" containerID="cri-o://7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e" gracePeriod=30 Apr 17 08:11:25.179038 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.179005 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"] Apr 17 08:11:25.179267 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.179252 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beb53e6c-fc3d-4a59-af39-8aea975cd714" containerName="kserve-container" Apr 17 08:11:25.179318 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.179270 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb53e6c-fc3d-4a59-af39-8aea975cd714" containerName="kserve-container" Apr 17 08:11:25.179318 
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.179282 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beb53e6c-fc3d-4a59-af39-8aea975cd714" containerName="storage-initializer" Apr 17 08:11:25.179318 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.179288 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb53e6c-fc3d-4a59-af39-8aea975cd714" containerName="storage-initializer" Apr 17 08:11:25.179481 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.179344 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="beb53e6c-fc3d-4a59-af39-8aea975cd714" containerName="kserve-container" Apr 17 08:11:25.182198 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.182180 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" Apr 17 08:11:25.193038 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.193015 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"] Apr 17 08:11:25.294159 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.294124 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0d45c53-3403-4618-9a11-e9105de07d13-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-65779c8984-ctnrs\" (UID: \"c0d45c53-3403-4618-9a11-e9105de07d13\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" Apr 17 08:11:25.394606 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.394518 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0d45c53-3403-4618-9a11-e9105de07d13-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-65779c8984-ctnrs\" (UID: \"c0d45c53-3403-4618-9a11-e9105de07d13\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" Apr 17 08:11:25.394874 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.394857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0d45c53-3403-4618-9a11-e9105de07d13-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-65779c8984-ctnrs\" (UID: \"c0d45c53-3403-4618-9a11-e9105de07d13\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" Apr 17 08:11:25.494122 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.494082 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" Apr 17 08:11:25.623437 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:25.623369 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"] Apr 17 08:11:25.626333 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:11:25.626298 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0d45c53_3403_4618_9a11_e9105de07d13.slice/crio-f95b59cf306ab41782ccf918969f8c4671381a45c91f3e9db58d60c11138e9d5 WatchSource:0}: Error finding container f95b59cf306ab41782ccf918969f8c4671381a45c91f3e9db58d60c11138e9d5: Status 404 returned error can't find the container with id f95b59cf306ab41782ccf918969f8c4671381a45c91f3e9db58d60c11138e9d5 Apr 17 08:11:26.255327 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.255304 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p"
Apr 17 08:11:26.401443 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.401338 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d70b4cb3-0e19-479a-8be0-32a61c68909b-kserve-provision-location\") pod \"d70b4cb3-0e19-479a-8be0-32a61c68909b\" (UID: \"d70b4cb3-0e19-479a-8be0-32a61c68909b\") "
Apr 17 08:11:26.401685 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.401658 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d70b4cb3-0e19-479a-8be0-32a61c68909b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d70b4cb3-0e19-479a-8be0-32a61c68909b" (UID: "d70b4cb3-0e19-479a-8be0-32a61c68909b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:11:26.502351 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.502009 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d70b4cb3-0e19-479a-8be0-32a61c68909b-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:11:26.562686 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.562655 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" event={"ID":"c0d45c53-3403-4618-9a11-e9105de07d13","Type":"ContainerStarted","Data":"f434a2a9e61f0df29b37e53460b9cb71846fdb8a658aa2206f4dd2fb9137ce0c"}
Apr 17 08:11:26.562823 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.562698 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"
event={"ID":"c0d45c53-3403-4618-9a11-e9105de07d13","Type":"ContainerStarted","Data":"f95b59cf306ab41782ccf918969f8c4671381a45c91f3e9db58d60c11138e9d5"}
Apr 17 08:11:26.564196 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.564171 2573 generic.go:358] "Generic (PLEG): container finished" podID="d70b4cb3-0e19-479a-8be0-32a61c68909b" containerID="7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e" exitCode=0
Apr 17 08:11:26.564295 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.564230 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" event={"ID":"d70b4cb3-0e19-479a-8be0-32a61c68909b","Type":"ContainerDied","Data":"7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e"}
Apr 17 08:11:26.564295 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.564254 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p"
Apr 17 08:11:26.564295 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.564261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p" event={"ID":"d70b4cb3-0e19-479a-8be0-32a61c68909b","Type":"ContainerDied","Data":"83e483db6fa822c7d61ceaad3081818ff13ffcee5dce80fb9256fdec6c6d5015"}
Apr 17 08:11:26.564295 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.564281 2573 scope.go:117] "RemoveContainer" containerID="7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e"
Apr 17 08:11:26.572132 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.572089 2573 scope.go:117] "RemoveContainer" containerID="c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895"
Apr 17 08:11:26.578879 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.578782 2573 scope.go:117] "RemoveContainer" containerID="7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e"
Apr 17
08:11:26.579214 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:11:26.579154 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e\": container with ID starting with 7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e not found: ID does not exist" containerID="7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e"
Apr 17 08:11:26.579361 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.579213 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e"} err="failed to get container status \"7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e\": rpc error: code = NotFound desc = could not find container \"7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e\": container with ID starting with 7c781117e9c011ffe5643377364106610aebca6ea2dc40c1cd2c0900088ec09e not found: ID does not exist"
Apr 17 08:11:26.579361 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.579238 2573 scope.go:117] "RemoveContainer" containerID="c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895"
Apr 17 08:11:26.579537 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:11:26.579508 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895\": container with ID starting with c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895 not found: ID does not exist" containerID="c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895"
Apr 17 08:11:26.579599 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.579543 2573 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895"} err="failed to get container status \"c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895\": rpc error: code = NotFound desc = could not find container \"c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895\": container with ID starting with c4518c20291be9e2e362e28c7ad624865819f1186313f783bc24877b4794f895 not found: ID does not exist"
Apr 17 08:11:26.589586 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.589565 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p"]
Apr 17 08:11:26.595145 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:26.595125 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-9zs9p"]
Apr 17 08:11:27.163753 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:27.163724 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d70b4cb3-0e19-479a-8be0-32a61c68909b" path="/var/lib/kubelet/pods/d70b4cb3-0e19-479a-8be0-32a61c68909b/volumes"
Apr 17 08:11:29.575564 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:29.575489 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0d45c53-3403-4618-9a11-e9105de07d13" containerID="f434a2a9e61f0df29b37e53460b9cb71846fdb8a658aa2206f4dd2fb9137ce0c" exitCode=0
Apr 17 08:11:29.575892 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:29.575570 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" event={"ID":"c0d45c53-3403-4618-9a11-e9105de07d13","Type":"ContainerDied","Data":"f434a2a9e61f0df29b37e53460b9cb71846fdb8a658aa2206f4dd2fb9137ce0c"}
Apr 17 08:11:30.580138 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:30.580104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"
event={"ID":"c0d45c53-3403-4618-9a11-e9105de07d13","Type":"ContainerStarted","Data":"5cc7b938daf06b2068e11341d0bfd77119f5be0d5218865623f2b30140cf2f3e"}
Apr 17 08:11:32.586157 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:32.586128 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" event={"ID":"c0d45c53-3403-4618-9a11-e9105de07d13","Type":"ContainerStarted","Data":"f61b68aee5136b781ab83eb962bf0c49fd99a21568d9d2634a2694f72f532222"}
Apr 17 08:11:32.586487 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:32.586289 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"
Apr 17 08:11:32.605064 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:32.605024 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" podStartSLOduration=4.743103232 podStartE2EDuration="7.605011346s" podCreationTimestamp="2026-04-17 08:11:25 +0000 UTC" firstStartedPulling="2026-04-17 08:11:29.634597113 +0000 UTC m=+1216.977848555" lastFinishedPulling="2026-04-17 08:11:32.496505227 +0000 UTC m=+1219.839756669" observedRunningTime="2026-04-17 08:11:32.603599611 +0000 UTC m=+1219.946851070" watchObservedRunningTime="2026-04-17 08:11:32.605011346 +0000 UTC m=+1219.948262809"
Apr 17 08:11:33.589365 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:11:33.589334 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"
Apr 17 08:12:04.595435 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:04.595314 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"
Apr 17 08:12:34.596922 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:34.596881 2573 kubelet.go:2658] "SyncLoop (probe)"
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"
Apr 17 08:12:35.301863 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.301828 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"]
Apr 17 08:12:35.302214 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.302167 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="kserve-container" containerID="cri-o://5cc7b938daf06b2068e11341d0bfd77119f5be0d5218865623f2b30140cf2f3e" gracePeriod=30
Apr 17 08:12:35.302329 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.302242 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="kserve-agent" containerID="cri-o://f61b68aee5136b781ab83eb962bf0c49fd99a21568d9d2634a2694f72f532222" gracePeriod=30
Apr 17 08:12:35.336677 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.336651 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"]
Apr 17 08:12:35.336897 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.336883 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d70b4cb3-0e19-479a-8be0-32a61c68909b" containerName="kserve-container"
Apr 17 08:12:35.336939 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.336901 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70b4cb3-0e19-479a-8be0-32a61c68909b" containerName="kserve-container"
Apr 17 08:12:35.336939 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.336915 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d70b4cb3-0e19-479a-8be0-32a61c68909b"
containerName="storage-initializer"
Apr 17 08:12:35.336939 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.336924 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70b4cb3-0e19-479a-8be0-32a61c68909b" containerName="storage-initializer"
Apr 17 08:12:35.337031 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.336969 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d70b4cb3-0e19-479a-8be0-32a61c68909b" containerName="kserve-container"
Apr 17 08:12:35.339244 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.339229 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"
Apr 17 08:12:35.347122 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.347098 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"]
Apr 17 08:12:35.366138 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.366116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3107b7fd-3241-464d-94a9-85521a5f350f-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-lfxb7\" (UID: \"3107b7fd-3241-464d-94a9-85521a5f350f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"
Apr 17 08:12:35.467244 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.467217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3107b7fd-3241-464d-94a9-85521a5f350f-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-lfxb7\" (UID: \"3107b7fd-3241-464d-94a9-85521a5f350f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"
Apr 17 08:12:35.467556 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.467541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3107b7fd-3241-464d-94a9-85521a5f350f-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-lfxb7\" (UID: \"3107b7fd-3241-464d-94a9-85521a5f350f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"
Apr 17 08:12:35.649539 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.649466 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"
Apr 17 08:12:35.766147 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:35.766120 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"]
Apr 17 08:12:35.768673 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:12:35.768644 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3107b7fd_3241_464d_94a9_85521a5f350f.slice/crio-bedf10b268490d4d486e5d365882a99c43ab6c9ea20cf257b444b0f544fc0fe6 WatchSource:0}: Error finding container bedf10b268490d4d486e5d365882a99c43ab6c9ea20cf257b444b0f544fc0fe6: Status 404 returned error can't find the container with id bedf10b268490d4d486e5d365882a99c43ab6c9ea20cf257b444b0f544fc0fe6
Apr 17 08:12:36.755254 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:36.755220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" event={"ID":"3107b7fd-3241-464d-94a9-85521a5f350f","Type":"ContainerStarted","Data":"983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a"}
Apr 17 08:12:36.755254 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:36.755256 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" event={"ID":"3107b7fd-3241-464d-94a9-85521a5f350f","Type":"ContainerStarted","Data":"bedf10b268490d4d486e5d365882a99c43ab6c9ea20cf257b444b0f544fc0fe6"}
Apr 17
08:12:37.759992 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:37.759958 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0d45c53-3403-4618-9a11-e9105de07d13" containerID="5cc7b938daf06b2068e11341d0bfd77119f5be0d5218865623f2b30140cf2f3e" exitCode=0
Apr 17 08:12:37.760329 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:37.760035 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" event={"ID":"c0d45c53-3403-4618-9a11-e9105de07d13","Type":"ContainerDied","Data":"5cc7b938daf06b2068e11341d0bfd77119f5be0d5218865623f2b30140cf2f3e"}
Apr 17 08:12:40.775398 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:40.775345 2573 generic.go:358] "Generic (PLEG): container finished" podID="3107b7fd-3241-464d-94a9-85521a5f350f" containerID="983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a" exitCode=0
Apr 17 08:12:40.775841 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:40.775431 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" event={"ID":"3107b7fd-3241-464d-94a9-85521a5f350f","Type":"ContainerDied","Data":"983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a"}
Apr 17 08:12:44.592267 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:44.592222 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.24:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 17 08:12:52.810498 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:52.810459 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"
event={"ID":"3107b7fd-3241-464d-94a9-85521a5f350f","Type":"ContainerStarted","Data":"ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94"}
Apr 17 08:12:52.810955 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:52.810762 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"
Apr 17 08:12:52.811754 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:52.811731 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" podUID="3107b7fd-3241-464d-94a9-85521a5f350f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 17 08:12:52.830107 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:52.830059 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" podStartSLOduration=6.398204839 podStartE2EDuration="17.830046502s" podCreationTimestamp="2026-04-17 08:12:35 +0000 UTC" firstStartedPulling="2026-04-17 08:12:40.776717853 +0000 UTC m=+1288.119969295" lastFinishedPulling="2026-04-17 08:12:52.208559508 +0000 UTC m=+1299.551810958" observedRunningTime="2026-04-17 08:12:52.828629772 +0000 UTC m=+1300.171881235" watchObservedRunningTime="2026-04-17 08:12:52.830046502 +0000 UTC m=+1300.173297955"
Apr 17 08:12:53.814168 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:53.814133 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" podUID="3107b7fd-3241-464d-94a9-85521a5f350f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 17 08:12:54.592665 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:12:54.592623 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"
podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.24:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 17 08:13:03.815002 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:03.814954 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" podUID="3107b7fd-3241-464d-94a9-85521a5f350f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 17 08:13:04.592685 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:04.592646 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.24:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 17 08:13:04.592863 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:04.592772 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"
Apr 17 08:13:05.846076 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:05.846043 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0d45c53-3403-4618-9a11-e9105de07d13" containerID="f61b68aee5136b781ab83eb962bf0c49fd99a21568d9d2634a2694f72f532222" exitCode=0
Apr 17 08:13:05.846548 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:05.846088 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" event={"ID":"c0d45c53-3403-4618-9a11-e9105de07d13","Type":"ContainerDied","Data":"f61b68aee5136b781ab83eb962bf0c49fd99a21568d9d2634a2694f72f532222"}
Apr 17 08:13:05.938456 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:05.938427 2573 util.go:48] "No ready
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"
Apr 17 08:13:05.991027 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:05.991004 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0d45c53-3403-4618-9a11-e9105de07d13-kserve-provision-location\") pod \"c0d45c53-3403-4618-9a11-e9105de07d13\" (UID: \"c0d45c53-3403-4618-9a11-e9105de07d13\") "
Apr 17 08:13:05.991309 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:05.991286 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0d45c53-3403-4618-9a11-e9105de07d13-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c0d45c53-3403-4618-9a11-e9105de07d13" (UID: "c0d45c53-3403-4618-9a11-e9105de07d13"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:13:06.091530 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:06.091477 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0d45c53-3403-4618-9a11-e9105de07d13-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:13:06.850577 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:06.850555 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"
Apr 17 08:13:06.850975 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:06.850549 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs" event={"ID":"c0d45c53-3403-4618-9a11-e9105de07d13","Type":"ContainerDied","Data":"f95b59cf306ab41782ccf918969f8c4671381a45c91f3e9db58d60c11138e9d5"}
Apr 17 08:13:06.850975 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:06.850672 2573 scope.go:117] "RemoveContainer" containerID="f61b68aee5136b781ab83eb962bf0c49fd99a21568d9d2634a2694f72f532222"
Apr 17 08:13:06.858184 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:06.858035 2573 scope.go:117] "RemoveContainer" containerID="5cc7b938daf06b2068e11341d0bfd77119f5be0d5218865623f2b30140cf2f3e"
Apr 17 08:13:06.866721 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:06.866708 2573 scope.go:117] "RemoveContainer" containerID="f434a2a9e61f0df29b37e53460b9cb71846fdb8a658aa2206f4dd2fb9137ce0c"
Apr 17 08:13:06.870960 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:06.870940 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"]
Apr 17 08:13:06.873658 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:06.873637 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-ctnrs"]
Apr 17 08:13:07.163555 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:07.163495 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" path="/var/lib/kubelet/pods/c0d45c53-3403-4618-9a11-e9105de07d13/volumes"
Apr 17 08:13:13.814546 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:13.814497 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" podUID="3107b7fd-3241-464d-94a9-85521a5f350f"
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 17 08:13:23.815021 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:23.814977 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" podUID="3107b7fd-3241-464d-94a9-85521a5f350f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 17 08:13:33.815610 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:33.815537 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"
Apr 17 08:13:36.748439 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.748408 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"]
Apr 17 08:13:36.749004 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.748754 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" podUID="3107b7fd-3241-464d-94a9-85521a5f350f" containerName="kserve-container" containerID="cri-o://ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94" gracePeriod=30
Apr 17 08:13:36.818693 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.818661 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n"]
Apr 17 08:13:36.818905 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.818894 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="storage-initializer"
Apr 17 08:13:36.818947 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.818908 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="storage-initializer"
Apr 17 08:13:36.818947
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.818916 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="kserve-agent"
Apr 17 08:13:36.818947 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.818921 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="kserve-agent"
Apr 17 08:13:36.818947 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.818934 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="kserve-container"
Apr 17 08:13:36.818947 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.818940 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="kserve-container"
Apr 17 08:13:36.819095 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.818981 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="kserve-agent"
Apr 17 08:13:36.819095 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.818990 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0d45c53-3403-4618-9a11-e9105de07d13" containerName="kserve-container"
Apr 17 08:13:36.837627 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.837586 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n"]
Apr 17 08:13:36.837786 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:36.837711 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n"
Apr 17 08:13:37.004681 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:37.004609 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32135b50-a622-48fb-adf1-ce1fc9aab48a-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-59d7n\" (UID: \"32135b50-a622-48fb-adf1-ce1fc9aab48a\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n"
Apr 17 08:13:37.105260 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:37.105226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32135b50-a622-48fb-adf1-ce1fc9aab48a-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-59d7n\" (UID: \"32135b50-a622-48fb-adf1-ce1fc9aab48a\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n"
Apr 17 08:13:37.105617 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:37.105598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32135b50-a622-48fb-adf1-ce1fc9aab48a-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-59d7n\" (UID: \"32135b50-a622-48fb-adf1-ce1fc9aab48a\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n"
Apr 17 08:13:37.148422 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:37.148392 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n"
Apr 17 08:13:37.264756 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:37.264732 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n"]
Apr 17 08:13:37.267297 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:13:37.267266 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32135b50_a622_48fb_adf1_ce1fc9aab48a.slice/crio-7a7592e26c643777f8949427e618f479281a2f37b2f0679d1423b031d7c9044e WatchSource:0}: Error finding container 7a7592e26c643777f8949427e618f479281a2f37b2f0679d1423b031d7c9044e: Status 404 returned error can't find the container with id 7a7592e26c643777f8949427e618f479281a2f37b2f0679d1423b031d7c9044e
Apr 17 08:13:37.933268 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:37.933235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" event={"ID":"32135b50-a622-48fb-adf1-ce1fc9aab48a","Type":"ContainerStarted","Data":"c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3"}
Apr 17 08:13:37.933268 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:37.933267 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" event={"ID":"32135b50-a622-48fb-adf1-ce1fc9aab48a","Type":"ContainerStarted","Data":"7a7592e26c643777f8949427e618f479281a2f37b2f0679d1423b031d7c9044e"}
Apr 17 08:13:39.190031 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.190010 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" Apr 17 08:13:39.320848 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.320790 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3107b7fd-3241-464d-94a9-85521a5f350f-kserve-provision-location\") pod \"3107b7fd-3241-464d-94a9-85521a5f350f\" (UID: \"3107b7fd-3241-464d-94a9-85521a5f350f\") " Apr 17 08:13:39.329881 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.329857 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3107b7fd-3241-464d-94a9-85521a5f350f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3107b7fd-3241-464d-94a9-85521a5f350f" (UID: "3107b7fd-3241-464d-94a9-85521a5f350f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:13:39.421375 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.421353 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3107b7fd-3241-464d-94a9-85521a5f350f-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:13:39.940535 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.940497 2573 generic.go:358] "Generic (PLEG): container finished" podID="3107b7fd-3241-464d-94a9-85521a5f350f" containerID="ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94" exitCode=0 Apr 17 08:13:39.940708 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.940593 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" Apr 17 08:13:39.940708 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.940586 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" event={"ID":"3107b7fd-3241-464d-94a9-85521a5f350f","Type":"ContainerDied","Data":"ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94"} Apr 17 08:13:39.940819 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.940710 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7" event={"ID":"3107b7fd-3241-464d-94a9-85521a5f350f","Type":"ContainerDied","Data":"bedf10b268490d4d486e5d365882a99c43ab6c9ea20cf257b444b0f544fc0fe6"} Apr 17 08:13:39.940819 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.940727 2573 scope.go:117] "RemoveContainer" containerID="ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94" Apr 17 08:13:39.948893 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.948875 2573 scope.go:117] "RemoveContainer" containerID="983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a" Apr 17 08:13:39.955539 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.955523 2573 scope.go:117] "RemoveContainer" containerID="ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94" Apr 17 08:13:39.955769 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:13:39.955751 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94\": container with ID starting with ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94 not found: ID does not exist" containerID="ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94" Apr 17 08:13:39.955820 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.955778 2573 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94"} err="failed to get container status \"ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94\": rpc error: code = NotFound desc = could not find container \"ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94\": container with ID starting with ffc0d858bc830ff9bb1864a09d4af4ed0cfced299ca8cc1dabd727d5bba54b94 not found: ID does not exist" Apr 17 08:13:39.955820 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.955794 2573 scope.go:117] "RemoveContainer" containerID="983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a" Apr 17 08:13:39.956004 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:13:39.955988 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a\": container with ID starting with 983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a not found: ID does not exist" containerID="983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a" Apr 17 08:13:39.956040 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.956012 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a"} err="failed to get container status \"983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a\": rpc error: code = NotFound desc = could not find container \"983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a\": container with ID starting with 983e3798b5d57bacaa26771542fabc152fe7ce984ea12944dfbc3c38ef52304a not found: ID does not exist" Apr 17 08:13:39.960002 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.959980 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"] Apr 17 08:13:39.963648 
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:39.963628 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lfxb7"] Apr 17 08:13:41.163490 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:41.163457 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3107b7fd-3241-464d-94a9-85521a5f350f" path="/var/lib/kubelet/pods/3107b7fd-3241-464d-94a9-85521a5f350f/volumes" Apr 17 08:13:41.948126 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:41.948093 2573 generic.go:358] "Generic (PLEG): container finished" podID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerID="c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3" exitCode=0 Apr 17 08:13:41.948310 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:41.948148 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" event={"ID":"32135b50-a622-48fb-adf1-ce1fc9aab48a","Type":"ContainerDied","Data":"c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3"} Apr 17 08:13:42.952639 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:42.952598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" event={"ID":"32135b50-a622-48fb-adf1-ce1fc9aab48a","Type":"ContainerStarted","Data":"f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142"} Apr 17 08:13:42.953026 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:42.952883 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" Apr 17 08:13:42.954122 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:42.954095 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.132.0.26:8080: connect: connection refused" Apr 17 08:13:42.969495 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:42.969453 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" podStartSLOduration=6.969440869 podStartE2EDuration="6.969440869s" podCreationTimestamp="2026-04-17 08:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:13:42.968181301 +0000 UTC m=+1350.311432763" watchObservedRunningTime="2026-04-17 08:13:42.969440869 +0000 UTC m=+1350.312692331" Apr 17 08:13:43.955683 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:43.955645 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 17 08:13:53.956279 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:13:53.956235 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 17 08:14:03.956030 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:03.955987 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 17 08:14:13.955933 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:13.955888 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 17 08:14:23.956885 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:23.956851 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" Apr 17 08:14:28.226269 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.226237 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n"] Apr 17 08:14:28.226757 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.226603 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerName="kserve-container" containerID="cri-o://f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142" gracePeriod=30 Apr 17 08:14:28.311063 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.311033 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh"] Apr 17 08:14:28.311285 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.311274 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3107b7fd-3241-464d-94a9-85521a5f350f" containerName="storage-initializer" Apr 17 08:14:28.311328 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.311288 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3107b7fd-3241-464d-94a9-85521a5f350f" containerName="storage-initializer" Apr 17 08:14:28.311328 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.311296 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3107b7fd-3241-464d-94a9-85521a5f350f" containerName="kserve-container" Apr 17 
08:14:28.311328 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.311302 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3107b7fd-3241-464d-94a9-85521a5f350f" containerName="kserve-container" Apr 17 08:14:28.311438 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.311341 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3107b7fd-3241-464d-94a9-85521a5f350f" containerName="kserve-container" Apr 17 08:14:28.313765 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.313747 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" Apr 17 08:14:28.323169 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.323145 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh"] Apr 17 08:14:28.354461 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.354438 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/373488b9-5793-430d-91c4-1c01b4c013b0-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh\" (UID: \"373488b9-5793-430d-91c4-1c01b4c013b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" Apr 17 08:14:28.455596 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.455563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/373488b9-5793-430d-91c4-1c01b4c013b0-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh\" (UID: \"373488b9-5793-430d-91c4-1c01b4c013b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" Apr 17 08:14:28.455888 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.455873 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/373488b9-5793-430d-91c4-1c01b4c013b0-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh\" (UID: \"373488b9-5793-430d-91c4-1c01b4c013b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" Apr 17 08:14:28.623582 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.623498 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" Apr 17 08:14:28.738167 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:28.738137 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh"] Apr 17 08:14:28.741741 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:14:28.741716 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod373488b9_5793_430d_91c4_1c01b4c013b0.slice/crio-51f801491fe969c8ebd785cb99656903e4c158c5067a31fb048bdf9cb193873e WatchSource:0}: Error finding container 51f801491fe969c8ebd785cb99656903e4c158c5067a31fb048bdf9cb193873e: Status 404 returned error can't find the container with id 51f801491fe969c8ebd785cb99656903e4c158c5067a31fb048bdf9cb193873e Apr 17 08:14:29.074937 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:29.074899 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" event={"ID":"373488b9-5793-430d-91c4-1c01b4c013b0","Type":"ContainerStarted","Data":"581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87"} Apr 17 08:14:29.074937 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:29.074938 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" 
event={"ID":"373488b9-5793-430d-91c4-1c01b4c013b0","Type":"ContainerStarted","Data":"51f801491fe969c8ebd785cb99656903e4c158c5067a31fb048bdf9cb193873e"} Apr 17 08:14:30.682044 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:30.682020 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" Apr 17 08:14:30.770756 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:30.770697 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32135b50-a622-48fb-adf1-ce1fc9aab48a-kserve-provision-location\") pod \"32135b50-a622-48fb-adf1-ce1fc9aab48a\" (UID: \"32135b50-a622-48fb-adf1-ce1fc9aab48a\") " Apr 17 08:14:30.780457 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:30.780428 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32135b50-a622-48fb-adf1-ce1fc9aab48a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "32135b50-a622-48fb-adf1-ce1fc9aab48a" (UID: "32135b50-a622-48fb-adf1-ce1fc9aab48a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:14:30.871977 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:30.871943 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32135b50-a622-48fb-adf1-ce1fc9aab48a-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:14:31.081893 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.081832 2573 generic.go:358] "Generic (PLEG): container finished" podID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerID="f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142" exitCode=0 Apr 17 08:14:31.081893 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.081878 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" event={"ID":"32135b50-a622-48fb-adf1-ce1fc9aab48a","Type":"ContainerDied","Data":"f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142"} Apr 17 08:14:31.082015 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.081904 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" Apr 17 08:14:31.082015 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.081913 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n" event={"ID":"32135b50-a622-48fb-adf1-ce1fc9aab48a","Type":"ContainerDied","Data":"7a7592e26c643777f8949427e618f479281a2f37b2f0679d1423b031d7c9044e"} Apr 17 08:14:31.082015 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.081930 2573 scope.go:117] "RemoveContainer" containerID="f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142" Apr 17 08:14:31.090016 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.089999 2573 scope.go:117] "RemoveContainer" containerID="c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3" Apr 17 08:14:31.097109 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.097089 2573 scope.go:117] "RemoveContainer" containerID="f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142" Apr 17 08:14:31.097365 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:14:31.097347 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142\": container with ID starting with f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142 not found: ID does not exist" containerID="f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142" Apr 17 08:14:31.097438 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.097374 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142"} err="failed to get container status \"f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142\": rpc error: code = NotFound desc = could not find container 
\"f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142\": container with ID starting with f21e3d53340acea00a4ba2d3ecc7689893acbd62922ce6926a4d547080c23142 not found: ID does not exist" Apr 17 08:14:31.097438 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.097416 2573 scope.go:117] "RemoveContainer" containerID="c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3" Apr 17 08:14:31.097664 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:14:31.097647 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3\": container with ID starting with c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3 not found: ID does not exist" containerID="c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3" Apr 17 08:14:31.097707 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.097668 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3"} err="failed to get container status \"c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3\": rpc error: code = NotFound desc = could not find container \"c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3\": container with ID starting with c63fe14a0f7f8e0eb090aa5132bdcda4e19554a1fe826a05995b7d77ff03c3d3 not found: ID does not exist" Apr 17 08:14:31.101655 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.101633 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n"] Apr 17 08:14:31.104884 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.104863 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-59d7n"] Apr 17 08:14:31.163558 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:31.163536 2573 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" path="/var/lib/kubelet/pods/32135b50-a622-48fb-adf1-ce1fc9aab48a/volumes" Apr 17 08:14:34.095496 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:34.095460 2573 generic.go:358] "Generic (PLEG): container finished" podID="373488b9-5793-430d-91c4-1c01b4c013b0" containerID="581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87" exitCode=0 Apr 17 08:14:34.095496 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:34.095499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" event={"ID":"373488b9-5793-430d-91c4-1c01b4c013b0","Type":"ContainerDied","Data":"581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87"} Apr 17 08:14:35.099826 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:35.099787 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" event={"ID":"373488b9-5793-430d-91c4-1c01b4c013b0","Type":"ContainerStarted","Data":"ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0"} Apr 17 08:14:35.100200 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:35.100157 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" Apr 17 08:14:35.101279 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:35.101247 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 08:14:35.116301 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:35.116263 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" podStartSLOduration=7.11625168 podStartE2EDuration="7.11625168s" podCreationTimestamp="2026-04-17 08:14:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:14:35.114999775 +0000 UTC m=+1402.458251238" watchObservedRunningTime="2026-04-17 08:14:35.11625168 +0000 UTC m=+1402.459503144" Apr 17 08:14:36.103658 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:36.103619 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 08:14:46.103903 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:46.103861 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 08:14:56.103699 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:14:56.103654 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 08:15:06.104039 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:06.103954 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 08:15:16.105428 
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:16.105393 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" Apr 17 08:15:20.049787 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.049755 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh"] Apr 17 08:15:20.050143 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.050011 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" containerName="kserve-container" containerID="cri-o://ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0" gracePeriod=30 Apr 17 08:15:20.125031 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.124998 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn"] Apr 17 08:15:20.125241 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.125230 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerName="storage-initializer" Apr 17 08:15:20.125241 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.125242 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerName="storage-initializer" Apr 17 08:15:20.125327 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.125252 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerName="kserve-container" Apr 17 08:15:20.125327 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.125258 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerName="kserve-container" Apr 17 08:15:20.125327 ip-10-0-138-143 kubenswrapper[2573]: 
I0417 08:15:20.125301 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="32135b50-a622-48fb-adf1-ce1fc9aab48a" containerName="kserve-container" Apr 17 08:15:20.127168 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.127152 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" Apr 17 08:15:20.134967 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.134939 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn"] Apr 17 08:15:20.223856 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.223831 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1aa15de7-b551-4981-ac9f-aea711be06ff-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-bp5zn\" (UID: \"1aa15de7-b551-4981-ac9f-aea711be06ff\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" Apr 17 08:15:20.325195 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.325117 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1aa15de7-b551-4981-ac9f-aea711be06ff-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-bp5zn\" (UID: \"1aa15de7-b551-4981-ac9f-aea711be06ff\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" Apr 17 08:15:20.325487 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.325471 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1aa15de7-b551-4981-ac9f-aea711be06ff-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-bp5zn\" (UID: \"1aa15de7-b551-4981-ac9f-aea711be06ff\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" Apr 17 08:15:20.437511 ip-10-0-138-143 
kubenswrapper[2573]: I0417 08:15:20.437462 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn"
Apr 17 08:15:20.553307 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:20.553278 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn"]
Apr 17 08:15:20.556264 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:15:20.556231 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aa15de7_b551_4981_ac9f_aea711be06ff.slice/crio-977326d58c436e1fbe0c5118a3284dfc8dc86207b63590697957a68545a7daff WatchSource:0}: Error finding container 977326d58c436e1fbe0c5118a3284dfc8dc86207b63590697957a68545a7daff: Status 404 returned error can't find the container with id 977326d58c436e1fbe0c5118a3284dfc8dc86207b63590697957a68545a7daff
Apr 17 08:15:21.223345 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:21.223305 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" event={"ID":"1aa15de7-b551-4981-ac9f-aea711be06ff","Type":"ContainerStarted","Data":"fe54ab72110d6b04f73162ba409d33ed7b418aaa1673e6440bc2c2f9afdc8b33"}
Apr 17 08:15:21.223345 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:21.223341 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" event={"ID":"1aa15de7-b551-4981-ac9f-aea711be06ff","Type":"ContainerStarted","Data":"977326d58c436e1fbe0c5118a3284dfc8dc86207b63590697957a68545a7daff"}
Apr 17 08:15:22.592571 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:22.592535 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh"
Apr 17 08:15:22.742826 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:22.742792 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/373488b9-5793-430d-91c4-1c01b4c013b0-kserve-provision-location\") pod \"373488b9-5793-430d-91c4-1c01b4c013b0\" (UID: \"373488b9-5793-430d-91c4-1c01b4c013b0\") "
Apr 17 08:15:22.751477 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:22.751453 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/373488b9-5793-430d-91c4-1c01b4c013b0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "373488b9-5793-430d-91c4-1c01b4c013b0" (UID: "373488b9-5793-430d-91c4-1c01b4c013b0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:15:22.843148 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:22.843122 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/373488b9-5793-430d-91c4-1c01b4c013b0-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:15:23.229330 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.229301 2573 generic.go:358] "Generic (PLEG): container finished" podID="373488b9-5793-430d-91c4-1c01b4c013b0" containerID="ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0" exitCode=0
Apr 17 08:15:23.229448 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.229369 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" event={"ID":"373488b9-5793-430d-91c4-1c01b4c013b0","Type":"ContainerDied","Data":"ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0"}
Apr 17 08:15:23.229448 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.229371 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh"
Apr 17 08:15:23.229448 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.229412 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh" event={"ID":"373488b9-5793-430d-91c4-1c01b4c013b0","Type":"ContainerDied","Data":"51f801491fe969c8ebd785cb99656903e4c158c5067a31fb048bdf9cb193873e"}
Apr 17 08:15:23.229448 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.229428 2573 scope.go:117] "RemoveContainer" containerID="ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0"
Apr 17 08:15:23.236962 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.236946 2573 scope.go:117] "RemoveContainer" containerID="581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87"
Apr 17 08:15:23.243809 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.243792 2573 scope.go:117] "RemoveContainer" containerID="ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0"
Apr 17 08:15:23.243942 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.243920 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh"]
Apr 17 08:15:23.244114 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:15:23.244097 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0\": container with ID starting with ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0 not found: ID does not exist" containerID="ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0"
Apr 17 08:15:23.244150 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.244123 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0"} err="failed to get container status \"ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0\": rpc error: code = NotFound desc = could not find container \"ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0\": container with ID starting with ce55a64165abcd9bd34c97392e6a444b977da6e76a1522b562a2e3f7739785d0 not found: ID does not exist"
Apr 17 08:15:23.244150 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.244140 2573 scope.go:117] "RemoveContainer" containerID="581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87"
Apr 17 08:15:23.244415 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:15:23.244372 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87\": container with ID starting with 581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87 not found: ID does not exist" containerID="581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87"
Apr 17 08:15:23.244515 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.244419 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87"} err="failed to get container status \"581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87\": rpc error: code = NotFound desc = could not find container \"581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87\": container with ID starting with 581c39fb1cf06c299bfe6817ec883d47d24819711ca9d56f41aea73ec692bb87 not found: ID does not exist"
Apr 17 08:15:23.247928 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:23.247908 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-g7pjh"]
Apr 17 08:15:24.234363 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:24.234328 2573 generic.go:358] "Generic (PLEG): container finished" podID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerID="fe54ab72110d6b04f73162ba409d33ed7b418aaa1673e6440bc2c2f9afdc8b33" exitCode=0
Apr 17 08:15:24.234692 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:24.234401 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" event={"ID":"1aa15de7-b551-4981-ac9f-aea711be06ff","Type":"ContainerDied","Data":"fe54ab72110d6b04f73162ba409d33ed7b418aaa1673e6440bc2c2f9afdc8b33"}
Apr 17 08:15:25.166833 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:25.166404 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" path="/var/lib/kubelet/pods/373488b9-5793-430d-91c4-1c01b4c013b0/volumes"
Apr 17 08:15:31.256315 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:31.256285 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" event={"ID":"1aa15de7-b551-4981-ac9f-aea711be06ff","Type":"ContainerStarted","Data":"81f5a52b2563f147c0bc91093b193b92ddba2fb80b48f6b5ebcb59906561b25f"}
Apr 17 08:15:31.256701 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:31.256595 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn"
Apr 17 08:15:31.257723 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:31.257700 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 17 08:15:31.271416 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:31.271334 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podStartSLOduration=4.400885751 podStartE2EDuration="11.271322814s" podCreationTimestamp="2026-04-17 08:15:20 +0000 UTC" firstStartedPulling="2026-04-17 08:15:24.235442807 +0000 UTC m=+1451.578694248" lastFinishedPulling="2026-04-17 08:15:31.105879871 +0000 UTC m=+1458.449131311" observedRunningTime="2026-04-17 08:15:31.269969315 +0000 UTC m=+1458.613220778" watchObservedRunningTime="2026-04-17 08:15:31.271322814 +0000 UTC m=+1458.614574280"
Apr 17 08:15:32.260032 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:32.259996 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 17 08:15:42.260794 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:42.260744 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 17 08:15:52.260314 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:15:52.260269 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 17 08:16:02.260155 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:16:02.260103 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 17 08:16:12.260576 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:16:12.260532 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 17 08:16:22.261053 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:16:22.261005 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 17 08:16:32.260494 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:16:32.260405 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 17 08:16:42.260976 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:16:42.260933 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 17 08:16:52.261562 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:16:52.261529 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn"
Apr 17 08:17:01.359820 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.359785 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn"]
Apr 17 08:17:01.360256 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.360029 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container" containerID="cri-o://81f5a52b2563f147c0bc91093b193b92ddba2fb80b48f6b5ebcb59906561b25f" gracePeriod=30
Apr 17 08:17:01.434997 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.434969 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"]
Apr 17 08:17:01.435208 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.435196 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" containerName="kserve-container"
Apr 17 08:17:01.435208 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.435209 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" containerName="kserve-container"
Apr 17 08:17:01.435292 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.435227 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" containerName="storage-initializer"
Apr 17 08:17:01.435292 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.435232 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" containerName="storage-initializer"
Apr 17 08:17:01.435292 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.435272 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="373488b9-5793-430d-91c4-1c01b4c013b0" containerName="kserve-container"
Apr 17 08:17:01.438114 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.438095 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"
Apr 17 08:17:01.444860 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.444800 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"]
Apr 17 08:17:01.494839 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.494819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a671e7b2-992e-48ff-abdf-04e71b448612-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-wk4lh\" (UID: \"a671e7b2-992e-48ff-abdf-04e71b448612\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"
Apr 17 08:17:01.596029 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.595996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a671e7b2-992e-48ff-abdf-04e71b448612-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-wk4lh\" (UID: \"a671e7b2-992e-48ff-abdf-04e71b448612\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"
Apr 17 08:17:01.596341 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.596323 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a671e7b2-992e-48ff-abdf-04e71b448612-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-wk4lh\" (UID: \"a671e7b2-992e-48ff-abdf-04e71b448612\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"
Apr 17 08:17:01.748441 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.748417 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"
Apr 17 08:17:01.862472 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.862443 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"]
Apr 17 08:17:01.865553 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:17:01.865525 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda671e7b2_992e_48ff_abdf_04e71b448612.slice/crio-4fc6ba639b9aa29dae37ec86c6ef2ec66dc2f7609371389963f26ea8168914ea WatchSource:0}: Error finding container 4fc6ba639b9aa29dae37ec86c6ef2ec66dc2f7609371389963f26ea8168914ea: Status 404 returned error can't find the container with id 4fc6ba639b9aa29dae37ec86c6ef2ec66dc2f7609371389963f26ea8168914ea
Apr 17 08:17:01.867437 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:01.867416 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 08:17:02.261078 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:02.261034 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 17 08:17:02.494244 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:02.494211 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" event={"ID":"a671e7b2-992e-48ff-abdf-04e71b448612","Type":"ContainerStarted","Data":"a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6"}
Apr 17 08:17:02.494244 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:02.494247 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" event={"ID":"a671e7b2-992e-48ff-abdf-04e71b448612","Type":"ContainerStarted","Data":"4fc6ba639b9aa29dae37ec86c6ef2ec66dc2f7609371389963f26ea8168914ea"}
Apr 17 08:17:04.502263 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:04.502229 2573 generic.go:358] "Generic (PLEG): container finished" podID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerID="81f5a52b2563f147c0bc91093b193b92ddba2fb80b48f6b5ebcb59906561b25f" exitCode=0
Apr 17 08:17:04.502570 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:04.502295 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" event={"ID":"1aa15de7-b551-4981-ac9f-aea711be06ff","Type":"ContainerDied","Data":"81f5a52b2563f147c0bc91093b193b92ddba2fb80b48f6b5ebcb59906561b25f"}
Apr 17 08:17:04.597611 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:04.597548 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn"
Apr 17 08:17:04.714124 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:04.714100 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1aa15de7-b551-4981-ac9f-aea711be06ff-kserve-provision-location\") pod \"1aa15de7-b551-4981-ac9f-aea711be06ff\" (UID: \"1aa15de7-b551-4981-ac9f-aea711be06ff\") "
Apr 17 08:17:04.714420 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:04.714401 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa15de7-b551-4981-ac9f-aea711be06ff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1aa15de7-b551-4981-ac9f-aea711be06ff" (UID: "1aa15de7-b551-4981-ac9f-aea711be06ff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:17:04.814501 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:04.814480 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1aa15de7-b551-4981-ac9f-aea711be06ff-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:17:05.506877 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:05.506838 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn" event={"ID":"1aa15de7-b551-4981-ac9f-aea711be06ff","Type":"ContainerDied","Data":"977326d58c436e1fbe0c5118a3284dfc8dc86207b63590697957a68545a7daff"}
Apr 17 08:17:05.507235 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:05.506898 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn"
Apr 17 08:17:05.507235 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:05.506900 2573 scope.go:117] "RemoveContainer" containerID="81f5a52b2563f147c0bc91093b193b92ddba2fb80b48f6b5ebcb59906561b25f"
Apr 17 08:17:05.508494 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:05.508467 2573 generic.go:358] "Generic (PLEG): container finished" podID="a671e7b2-992e-48ff-abdf-04e71b448612" containerID="a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6" exitCode=0
Apr 17 08:17:05.508585 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:05.508524 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" event={"ID":"a671e7b2-992e-48ff-abdf-04e71b448612","Type":"ContainerDied","Data":"a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6"}
Apr 17 08:17:05.514960 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:05.514941 2573 scope.go:117] "RemoveContainer" containerID="fe54ab72110d6b04f73162ba409d33ed7b418aaa1673e6440bc2c2f9afdc8b33"
Apr 17 08:17:05.521890 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:05.521844 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn"]
Apr 17 08:17:05.528594 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:05.528573 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-bp5zn"]
Apr 17 08:17:06.514322 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:06.514275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" event={"ID":"a671e7b2-992e-48ff-abdf-04e71b448612","Type":"ContainerStarted","Data":"9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6"}
Apr 17 08:17:06.514747 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:06.514614 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"
Apr 17 08:17:06.515994 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:06.515967 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 17 08:17:06.529861 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:06.529801 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podStartSLOduration=5.529787704 podStartE2EDuration="5.529787704s" podCreationTimestamp="2026-04-17 08:17:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:17:06.528958186 +0000 UTC m=+1553.872209648" watchObservedRunningTime="2026-04-17 08:17:06.529787704 +0000 UTC m=+1553.873039169"
Apr 17 08:17:07.164442 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:07.164409 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" path="/var/lib/kubelet/pods/1aa15de7-b551-4981-ac9f-aea711be06ff/volumes"
Apr 17 08:17:07.517293 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:07.517263 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 17 08:17:17.518181 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:17.518141 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 17 08:17:27.518060 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:27.518015 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 17 08:17:37.517660 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:37.517617 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 17 08:17:47.517569 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:47.517526 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 17 08:17:57.517446 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:17:57.517404 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 17 08:18:07.517645 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:07.517604 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 17 08:18:13.161480 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:13.161441 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 17 08:18:23.163523 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:23.163492 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"
Apr 17 08:18:32.461195 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.461164 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"]
Apr 17 08:18:32.461585 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.461445 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" containerID="cri-o://9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6" gracePeriod=30
Apr 17 08:18:32.563566 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.563533 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf"]
Apr 17 08:18:32.563787 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.563775 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container"
Apr 17 08:18:32.563832 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.563789 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container"
Apr 17 08:18:32.563832 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.563798 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="storage-initializer"
Apr 17 08:18:32.563832 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.563804 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="storage-initializer"
Apr 17 08:18:32.563925 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.563851 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1aa15de7-b551-4981-ac9f-aea711be06ff" containerName="kserve-container"
Apr 17 08:18:32.566768 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.566751 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf"
Apr 17 08:18:32.576411 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.576365 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf"]
Apr 17 08:18:32.610895 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.610864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ab37c01-b5a5-44c3-990d-4e21466ca45a-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf\" (UID: \"7ab37c01-b5a5-44c3-990d-4e21466ca45a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf"
Apr 17 08:18:32.712214 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.712146 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ab37c01-b5a5-44c3-990d-4e21466ca45a-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf\" (UID: \"7ab37c01-b5a5-44c3-990d-4e21466ca45a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf"
Apr 17 08:18:32.712519 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.712503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ab37c01-b5a5-44c3-990d-4e21466ca45a-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf\" (UID: \"7ab37c01-b5a5-44c3-990d-4e21466ca45a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf"
Apr 17 08:18:32.877312 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.877277 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf"
Apr 17 08:18:32.994206 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:32.994181 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf"]
Apr 17 08:18:32.996726 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:18:32.996692 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ab37c01_b5a5_44c3_990d_4e21466ca45a.slice/crio-d87f1eca9e9dd82160f09dfcc0ee53b3010e68803392767c97464fc7831134db WatchSource:0}: Error finding container d87f1eca9e9dd82160f09dfcc0ee53b3010e68803392767c97464fc7831134db: Status 404 returned error can't find the container with id d87f1eca9e9dd82160f09dfcc0ee53b3010e68803392767c97464fc7831134db
Apr 17 08:18:33.161402 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:33.161348 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 17 08:18:33.747268 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:33.747228 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" event={"ID":"7ab37c01-b5a5-44c3-990d-4e21466ca45a","Type":"ContainerStarted","Data":"a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54"}
Apr 17 08:18:33.747683 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:33.747281 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" event={"ID":"7ab37c01-b5a5-44c3-990d-4e21466ca45a","Type":"ContainerStarted","Data":"d87f1eca9e9dd82160f09dfcc0ee53b3010e68803392767c97464fc7831134db"}
Apr 17 08:18:35.899437 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:35.899411 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"
Apr 17 08:18:35.935985 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:35.935962 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a671e7b2-992e-48ff-abdf-04e71b448612-kserve-provision-location\") pod \"a671e7b2-992e-48ff-abdf-04e71b448612\" (UID: \"a671e7b2-992e-48ff-abdf-04e71b448612\") "
Apr 17 08:18:35.936240 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:35.936217 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a671e7b2-992e-48ff-abdf-04e71b448612-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a671e7b2-992e-48ff-abdf-04e71b448612" (UID: "a671e7b2-992e-48ff-abdf-04e71b448612"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:18:36.036880 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.036826 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a671e7b2-992e-48ff-abdf-04e71b448612-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:18:36.756155 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.756126 2573 generic.go:358] "Generic (PLEG): container finished" podID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerID="a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54" exitCode=0
Apr 17 08:18:36.756340 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.756189 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" event={"ID":"7ab37c01-b5a5-44c3-990d-4e21466ca45a","Type":"ContainerDied","Data":"a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54"}
Apr 17 08:18:36.757774 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.757748 2573 generic.go:358] "Generic (PLEG): container finished" podID="a671e7b2-992e-48ff-abdf-04e71b448612" containerID="9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6" exitCode=0
Apr 17 08:18:36.757863 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.757786 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" event={"ID":"a671e7b2-992e-48ff-abdf-04e71b448612","Type":"ContainerDied","Data":"9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6"}
Apr 17 08:18:36.757863 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.757810 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh" event={"ID":"a671e7b2-992e-48ff-abdf-04e71b448612","Type":"ContainerDied","Data":"4fc6ba639b9aa29dae37ec86c6ef2ec66dc2f7609371389963f26ea8168914ea"}
Apr 17 08:18:36.757863 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.757816 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"
Apr 17 08:18:36.757863 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.757828 2573 scope.go:117] "RemoveContainer" containerID="9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6"
Apr 17 08:18:36.768475 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.768455 2573 scope.go:117] "RemoveContainer" containerID="a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6"
Apr 17 08:18:36.780483 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.780464 2573 scope.go:117] "RemoveContainer" containerID="9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6"
Apr 17 08:18:36.780756 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:18:36.780739 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6\": container with ID starting with 9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6 not found: ID does not exist" containerID="9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6"
Apr 17 08:18:36.780827 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.780763 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6"} err="failed to get container status \"9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6\": rpc error: code = NotFound desc = could not find container \"9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6\": container with ID starting with 9f804fe2e451a3cf57e41e52408b9805b20fc56d007132c277d0ff079e606fa6 not found: ID does not exist"
Apr 17 08:18:36.780827 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.780780 2573 scope.go:117] "RemoveContainer" containerID="a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6"
Apr 17 08:18:36.781025 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:18:36.781007 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6\": container with ID starting with a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6 not found: ID does not exist" containerID="a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6"
Apr 17 08:18:36.781075 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.781032 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6"} err="failed to get container status \"a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6\": rpc error: code = NotFound desc = could not find container \"a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6\": container with ID starting with a765efe4421467d2d07ed53d2bd51e426305f9babd5787eb05154e214574c9b6 not found: ID does not exist"
Apr 17 08:18:36.784230 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.784204 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"]
Apr 17 08:18:36.790182 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:36.790161 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wk4lh"]
Apr 17 08:18:37.163611 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:37.163537 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a671e7b2-992e-48ff-abdf-04e71b448612"
path="/var/lib/kubelet/pods/a671e7b2-992e-48ff-abdf-04e71b448612/volumes" Apr 17 08:18:37.764856 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:37.764821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" event={"ID":"7ab37c01-b5a5-44c3-990d-4e21466ca45a","Type":"ContainerStarted","Data":"8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9"} Apr 17 08:18:37.765153 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:37.765131 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" Apr 17 08:18:37.766406 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:37.766363 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:18:37.781499 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:37.781456 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" podStartSLOduration=5.781445214 podStartE2EDuration="5.781445214s" podCreationTimestamp="2026-04-17 08:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:18:37.779983773 +0000 UTC m=+1645.123235249" watchObservedRunningTime="2026-04-17 08:18:37.781445214 +0000 UTC m=+1645.124696677" Apr 17 08:18:38.768397 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:38.768332 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.30:8080: connect: connection refused" Apr 17 08:18:48.769231 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:48.769188 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:18:58.768620 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:18:58.768580 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:19:08.769005 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:19:08.768958 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:19:18.768721 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:19:18.768680 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:19:28.769085 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:19:28.769041 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:19:38.768842 ip-10-0-138-143 kubenswrapper[2573]: I0417 
08:19:38.768731 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:19:48.768965 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:19:48.768917 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:19:58.769583 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:19:58.769547 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" Apr 17 08:20:03.562904 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.562868 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf"] Apr 17 08:20:03.563371 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.563146 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" containerID="cri-o://8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9" gracePeriod=30 Apr 17 08:20:03.637443 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.637408 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x"] Apr 17 08:20:03.637716 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.637699 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="storage-initializer" Apr 17 
08:20:03.637790 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.637719 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="storage-initializer" Apr 17 08:20:03.637790 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.637737 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" Apr 17 08:20:03.637790 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.637746 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" Apr 17 08:20:03.637947 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.637812 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a671e7b2-992e-48ff-abdf-04e71b448612" containerName="kserve-container" Apr 17 08:20:03.640652 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.640632 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" Apr 17 08:20:03.647427 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.647409 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x"] Apr 17 08:20:03.738473 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.738445 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed2f346c-ff4d-4726-91ee-abb2cdd5ef20-kserve-provision-location\") pod \"isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x\" (UID: \"ed2f346c-ff4d-4726-91ee-abb2cdd5ef20\") " pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" Apr 17 08:20:03.839642 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.839559 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed2f346c-ff4d-4726-91ee-abb2cdd5ef20-kserve-provision-location\") pod \"isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x\" (UID: \"ed2f346c-ff4d-4726-91ee-abb2cdd5ef20\") " pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" Apr 17 08:20:03.839916 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.839898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed2f346c-ff4d-4726-91ee-abb2cdd5ef20-kserve-provision-location\") pod \"isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x\" (UID: \"ed2f346c-ff4d-4726-91ee-abb2cdd5ef20\") " pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" Apr 17 08:20:03.951306 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:03.951263 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" Apr 17 08:20:04.065327 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:04.065306 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x"] Apr 17 08:20:04.068100 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:20:04.068061 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded2f346c_ff4d_4726_91ee_abb2cdd5ef20.slice/crio-ed6f45bc869c73b766eef36411f160279a9b0956ac9700cbd15e89ffee742670 WatchSource:0}: Error finding container ed6f45bc869c73b766eef36411f160279a9b0956ac9700cbd15e89ffee742670: Status 404 returned error can't find the container with id ed6f45bc869c73b766eef36411f160279a9b0956ac9700cbd15e89ffee742670 Apr 17 08:20:04.997426 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:04.997373 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" event={"ID":"ed2f346c-ff4d-4726-91ee-abb2cdd5ef20","Type":"ContainerStarted","Data":"c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6"} Apr 17 08:20:04.997426 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:04.997428 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" event={"ID":"ed2f346c-ff4d-4726-91ee-abb2cdd5ef20","Type":"ContainerStarted","Data":"ed6f45bc869c73b766eef36411f160279a9b0956ac9700cbd15e89ffee742670"} Apr 17 08:20:06.703113 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:06.703089 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" Apr 17 08:20:06.862022 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:06.861949 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ab37c01-b5a5-44c3-990d-4e21466ca45a-kserve-provision-location\") pod \"7ab37c01-b5a5-44c3-990d-4e21466ca45a\" (UID: \"7ab37c01-b5a5-44c3-990d-4e21466ca45a\") " Apr 17 08:20:06.862232 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:06.862208 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ab37c01-b5a5-44c3-990d-4e21466ca45a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7ab37c01-b5a5-44c3-990d-4e21466ca45a" (UID: "7ab37c01-b5a5-44c3-990d-4e21466ca45a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:20:06.962793 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:06.962764 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ab37c01-b5a5-44c3-990d-4e21466ca45a-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:20:07.003473 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.003445 2573 generic.go:358] "Generic (PLEG): container finished" podID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerID="8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9" exitCode=0 Apr 17 08:20:07.003566 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.003509 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" Apr 17 08:20:07.003566 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.003523 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" event={"ID":"7ab37c01-b5a5-44c3-990d-4e21466ca45a","Type":"ContainerDied","Data":"8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9"} Apr 17 08:20:07.003566 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.003563 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf" event={"ID":"7ab37c01-b5a5-44c3-990d-4e21466ca45a","Type":"ContainerDied","Data":"d87f1eca9e9dd82160f09dfcc0ee53b3010e68803392767c97464fc7831134db"} Apr 17 08:20:07.003693 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.003579 2573 scope.go:117] "RemoveContainer" containerID="8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9" Apr 17 08:20:07.011536 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.011514 2573 scope.go:117] "RemoveContainer" containerID="a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54" Apr 17 08:20:07.018413 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.018373 2573 scope.go:117] "RemoveContainer" containerID="8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9" Apr 17 08:20:07.018665 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:20:07.018644 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9\": container with ID starting with 8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9 not found: ID does not exist" containerID="8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9" Apr 17 08:20:07.018704 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.018673 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9"} err="failed to get container status \"8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9\": rpc error: code = NotFound desc = could not find container \"8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9\": container with ID starting with 8a052314ea3b1218a01f61d64d6dccb4ed889cda465c8fb2cf274a94a0b807d9 not found: ID does not exist" Apr 17 08:20:07.018704 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.018691 2573 scope.go:117] "RemoveContainer" containerID="a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54" Apr 17 08:20:07.018926 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:20:07.018909 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54\": container with ID starting with a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54 not found: ID does not exist" containerID="a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54" Apr 17 08:20:07.018968 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.018931 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54"} err="failed to get container status \"a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54\": rpc error: code = NotFound desc = could not find container \"a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54\": container with ID starting with a01a719a158c005aa6d2cc4f61b8ae038e99b46b6e810c812ca8ab35efdafd54 not found: ID does not exist" Apr 17 08:20:07.022506 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.022483 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf"] Apr 17 08:20:07.028939 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.028912 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-hhwvf"] Apr 17 08:20:07.163266 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:07.163207 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" path="/var/lib/kubelet/pods/7ab37c01-b5a5-44c3-990d-4e21466ca45a/volumes" Apr 17 08:20:08.007851 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:08.007820 2573 generic.go:358] "Generic (PLEG): container finished" podID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerID="c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6" exitCode=0 Apr 17 08:20:08.008165 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:08.007855 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" event={"ID":"ed2f346c-ff4d-4726-91ee-abb2cdd5ef20","Type":"ContainerDied","Data":"c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6"} Apr 17 08:20:09.012057 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:09.012026 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" event={"ID":"ed2f346c-ff4d-4726-91ee-abb2cdd5ef20","Type":"ContainerStarted","Data":"2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386"} Apr 17 08:20:09.012434 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:09.012412 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" Apr 17 08:20:09.013407 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:09.013364 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" 
podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:20:09.027990 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:09.027947 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" podStartSLOduration=6.027934875 podStartE2EDuration="6.027934875s" podCreationTimestamp="2026-04-17 08:20:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:20:09.026392295 +0000 UTC m=+1736.369643749" watchObservedRunningTime="2026-04-17 08:20:09.027934875 +0000 UTC m=+1736.371186337" Apr 17 08:20:10.015057 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:10.015015 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:20:20.016048 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:20.016005 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:20:30.015894 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:30.015857 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:20:40.015609 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:40.015566 2573 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:20:50.015370 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:20:50.015327 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:21:00.015602 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:00.015560 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:21:10.016335 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:10.016307 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" Apr 17 08:21:13.771970 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.771934 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2"] Apr 17 08:21:13.772453 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.772193 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="storage-initializer" Apr 17 08:21:13.772453 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.772204 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="storage-initializer" Apr 17 08:21:13.772453 ip-10-0-138-143 kubenswrapper[2573]: 
I0417 08:21:13.772225 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" Apr 17 08:21:13.772453 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.772231 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" Apr 17 08:21:13.772453 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.772268 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ab37c01-b5a5-44c3-990d-4e21466ca45a" containerName="kserve-container" Apr 17 08:21:13.775076 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.775056 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" Apr 17 08:21:13.777253 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.777230 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 17 08:21:13.777353 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.777262 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-28dd60-dockercfg-grplz\"" Apr 17 08:21:13.777353 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.777282 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-28dd60\"" Apr 17 08:21:13.783538 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.783515 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2"] Apr 17 08:21:13.914297 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.914260 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/5204d8df-af7f-4c33-8500-b1e1ccc196eb-cabundle-cert\") pod \"isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2\" (UID: \"5204d8df-af7f-4c33-8500-b1e1ccc196eb\") " pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" Apr 17 08:21:13.914482 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:13.914304 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5204d8df-af7f-4c33-8500-b1e1ccc196eb-kserve-provision-location\") pod \"isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2\" (UID: \"5204d8df-af7f-4c33-8500-b1e1ccc196eb\") " pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" Apr 17 08:21:14.015052 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:14.015023 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5204d8df-af7f-4c33-8500-b1e1ccc196eb-cabundle-cert\") pod \"isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2\" (UID: \"5204d8df-af7f-4c33-8500-b1e1ccc196eb\") " pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" Apr 17 08:21:14.015206 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:14.015058 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5204d8df-af7f-4c33-8500-b1e1ccc196eb-kserve-provision-location\") pod \"isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2\" (UID: \"5204d8df-af7f-4c33-8500-b1e1ccc196eb\") " pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" Apr 17 08:21:14.015405 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:14.015365 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5204d8df-af7f-4c33-8500-b1e1ccc196eb-kserve-provision-location\") pod 
\"isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2\" (UID: \"5204d8df-af7f-4c33-8500-b1e1ccc196eb\") " pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" Apr 17 08:21:14.015691 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:14.015668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5204d8df-af7f-4c33-8500-b1e1ccc196eb-cabundle-cert\") pod \"isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2\" (UID: \"5204d8df-af7f-4c33-8500-b1e1ccc196eb\") " pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" Apr 17 08:21:14.085869 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:14.085820 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" Apr 17 08:21:14.200031 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:14.200003 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2"] Apr 17 08:21:14.204299 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:21:14.204273 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5204d8df_af7f_4c33_8500_b1e1ccc196eb.slice/crio-3ae2732a24d3be7747b9a2a1ba4d57afa004e4a1e2d73a3d8ee3d9b80fd340cf WatchSource:0}: Error finding container 3ae2732a24d3be7747b9a2a1ba4d57afa004e4a1e2d73a3d8ee3d9b80fd340cf: Status 404 returned error can't find the container with id 3ae2732a24d3be7747b9a2a1ba4d57afa004e4a1e2d73a3d8ee3d9b80fd340cf Apr 17 08:21:15.190497 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:15.190460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" event={"ID":"5204d8df-af7f-4c33-8500-b1e1ccc196eb","Type":"ContainerStarted","Data":"35d7345583639c4d4c71166821ce17e2dc3bfc7e985aed4bc7d0e9bcd59a0653"} Apr 17 
08:21:15.190497 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:15.190493 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" event={"ID":"5204d8df-af7f-4c33-8500-b1e1ccc196eb","Type":"ContainerStarted","Data":"3ae2732a24d3be7747b9a2a1ba4d57afa004e4a1e2d73a3d8ee3d9b80fd340cf"} Apr 17 08:21:17.197566 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:17.197536 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2_5204d8df-af7f-4c33-8500-b1e1ccc196eb/storage-initializer/0.log" Apr 17 08:21:17.197960 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:17.197574 2573 generic.go:358] "Generic (PLEG): container finished" podID="5204d8df-af7f-4c33-8500-b1e1ccc196eb" containerID="35d7345583639c4d4c71166821ce17e2dc3bfc7e985aed4bc7d0e9bcd59a0653" exitCode=1 Apr 17 08:21:17.197960 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:17.197654 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" event={"ID":"5204d8df-af7f-4c33-8500-b1e1ccc196eb","Type":"ContainerDied","Data":"35d7345583639c4d4c71166821ce17e2dc3bfc7e985aed4bc7d0e9bcd59a0653"} Apr 17 08:21:18.202109 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:18.202082 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2_5204d8df-af7f-4c33-8500-b1e1ccc196eb/storage-initializer/0.log" Apr 17 08:21:18.202493 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:18.202151 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" event={"ID":"5204d8df-af7f-4c33-8500-b1e1ccc196eb","Type":"ContainerStarted","Data":"918f4c6c4a87e5e68e57919ca72a1fd63b037d8e8b46b13929b568eed3bac07c"} Apr 17 08:21:21.210619 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:21.210593 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2_5204d8df-af7f-4c33-8500-b1e1ccc196eb/storage-initializer/1.log" Apr 17 08:21:21.211031 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:21.210961 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2_5204d8df-af7f-4c33-8500-b1e1ccc196eb/storage-initializer/0.log" Apr 17 08:21:21.211031 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:21.210996 2573 generic.go:358] "Generic (PLEG): container finished" podID="5204d8df-af7f-4c33-8500-b1e1ccc196eb" containerID="918f4c6c4a87e5e68e57919ca72a1fd63b037d8e8b46b13929b568eed3bac07c" exitCode=1 Apr 17 08:21:21.211106 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:21.211052 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" event={"ID":"5204d8df-af7f-4c33-8500-b1e1ccc196eb","Type":"ContainerDied","Data":"918f4c6c4a87e5e68e57919ca72a1fd63b037d8e8b46b13929b568eed3bac07c"} Apr 17 08:21:21.211106 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:21.211087 2573 scope.go:117] "RemoveContainer" containerID="35d7345583639c4d4c71166821ce17e2dc3bfc7e985aed4bc7d0e9bcd59a0653" Apr 17 08:21:21.211427 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:21.211405 2573 scope.go:117] "RemoveContainer" containerID="35d7345583639c4d4c71166821ce17e2dc3bfc7e985aed4bc7d0e9bcd59a0653" Apr 17 08:21:21.221418 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:21:21.221363 2573 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2_kserve-ci-e2e-test_5204d8df-af7f-4c33-8500-b1e1ccc196eb_0 in pod sandbox 3ae2732a24d3be7747b9a2a1ba4d57afa004e4a1e2d73a3d8ee3d9b80fd340cf from index: no such id: 
'35d7345583639c4d4c71166821ce17e2dc3bfc7e985aed4bc7d0e9bcd59a0653'" containerID="35d7345583639c4d4c71166821ce17e2dc3bfc7e985aed4bc7d0e9bcd59a0653" Apr 17 08:21:21.221558 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:21.221426 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d7345583639c4d4c71166821ce17e2dc3bfc7e985aed4bc7d0e9bcd59a0653"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2_kserve-ci-e2e-test_5204d8df-af7f-4c33-8500-b1e1ccc196eb_0 in pod sandbox 3ae2732a24d3be7747b9a2a1ba4d57afa004e4a1e2d73a3d8ee3d9b80fd340cf from index: no such id: '35d7345583639c4d4c71166821ce17e2dc3bfc7e985aed4bc7d0e9bcd59a0653'" Apr 17 08:21:21.221629 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:21:21.221607 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2_kserve-ci-e2e-test(5204d8df-af7f-4c33-8500-b1e1ccc196eb)\"" pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" podUID="5204d8df-af7f-4c33-8500-b1e1ccc196eb" Apr 17 08:21:22.215415 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:22.215365 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2_5204d8df-af7f-4c33-8500-b1e1ccc196eb/storage-initializer/1.log" Apr 17 08:21:27.838079 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:27.838047 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2"] Apr 17 08:21:27.882343 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:27.882313 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x"] Apr 17 
08:21:27.882699 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:27.882674 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" containerID="cri-o://2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386" gracePeriod=30 Apr 17 08:21:27.956974 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:27.956762 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg"] Apr 17 08:21:27.961305 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:27.961285 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" Apr 17 08:21:27.963931 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:27.963908 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-eb7a51\"" Apr 17 08:21:27.964051 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:27.963939 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-eb7a51-dockercfg-v57vd\"" Apr 17 08:21:27.967329 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:27.967312 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2_5204d8df-af7f-4c33-8500-b1e1ccc196eb/storage-initializer/1.log" Apr 17 08:21:27.967455 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:27.967368 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" Apr 17 08:21:27.968960 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:27.968942 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg"] Apr 17 08:21:28.106776 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.106672 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5204d8df-af7f-4c33-8500-b1e1ccc196eb-kserve-provision-location\") pod \"5204d8df-af7f-4c33-8500-b1e1ccc196eb\" (UID: \"5204d8df-af7f-4c33-8500-b1e1ccc196eb\") " Apr 17 08:21:28.106776 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.106760 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5204d8df-af7f-4c33-8500-b1e1ccc196eb-cabundle-cert\") pod \"5204d8df-af7f-4c33-8500-b1e1ccc196eb\" (UID: \"5204d8df-af7f-4c33-8500-b1e1ccc196eb\") " Apr 17 08:21:28.106968 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.106912 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-kserve-provision-location\") pod \"isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg\" (UID: \"c6942f8a-1117-4fe0-a0df-2b7596eda2e9\") " pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" Apr 17 08:21:28.106968 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.106952 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-cabundle-cert\") pod \"isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg\" (UID: \"c6942f8a-1117-4fe0-a0df-2b7596eda2e9\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" Apr 17 08:21:28.107051 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.106965 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5204d8df-af7f-4c33-8500-b1e1ccc196eb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5204d8df-af7f-4c33-8500-b1e1ccc196eb" (UID: "5204d8df-af7f-4c33-8500-b1e1ccc196eb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:21:28.107051 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.107045 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5204d8df-af7f-4c33-8500-b1e1ccc196eb-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:21:28.107126 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.107077 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5204d8df-af7f-4c33-8500-b1e1ccc196eb-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "5204d8df-af7f-4c33-8500-b1e1ccc196eb" (UID: "5204d8df-af7f-4c33-8500-b1e1ccc196eb"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:21:28.207462 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.207428 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-kserve-provision-location\") pod \"isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg\" (UID: \"c6942f8a-1117-4fe0-a0df-2b7596eda2e9\") " pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" Apr 17 08:21:28.207573 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.207479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-cabundle-cert\") pod \"isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg\" (UID: \"c6942f8a-1117-4fe0-a0df-2b7596eda2e9\") " pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" Apr 17 08:21:28.207573 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.207552 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5204d8df-af7f-4c33-8500-b1e1ccc196eb-cabundle-cert\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:21:28.207900 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.207878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-kserve-provision-location\") pod \"isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg\" (UID: \"c6942f8a-1117-4fe0-a0df-2b7596eda2e9\") " pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" Apr 17 08:21:28.208282 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.208266 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-cabundle-cert\") pod \"isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg\" (UID: \"c6942f8a-1117-4fe0-a0df-2b7596eda2e9\") " pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" Apr 17 08:21:28.231456 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.231435 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2_5204d8df-af7f-4c33-8500-b1e1ccc196eb/storage-initializer/1.log" Apr 17 08:21:28.231548 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.231519 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" event={"ID":"5204d8df-af7f-4c33-8500-b1e1ccc196eb","Type":"ContainerDied","Data":"3ae2732a24d3be7747b9a2a1ba4d57afa004e4a1e2d73a3d8ee3d9b80fd340cf"} Apr 17 08:21:28.231595 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.231546 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2" Apr 17 08:21:28.231595 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.231552 2573 scope.go:117] "RemoveContainer" containerID="918f4c6c4a87e5e68e57919ca72a1fd63b037d8e8b46b13929b568eed3bac07c" Apr 17 08:21:28.265412 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.265366 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2"] Apr 17 08:21:28.268867 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.268846 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-28dd60-predictor-69cdbf87b5-5gmg2"] Apr 17 08:21:28.275711 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.275694 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" Apr 17 08:21:28.390569 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:28.390545 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg"] Apr 17 08:21:28.393076 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:21:28.393039 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6942f8a_1117_4fe0_a0df_2b7596eda2e9.slice/crio-c90011115bbf4994688016934a884fd88049fb7035b46b44f019bbfbdbceeec5 WatchSource:0}: Error finding container c90011115bbf4994688016934a884fd88049fb7035b46b44f019bbfbdbceeec5: Status 404 returned error can't find the container with id c90011115bbf4994688016934a884fd88049fb7035b46b44f019bbfbdbceeec5 Apr 17 08:21:29.163574 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:29.163541 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5204d8df-af7f-4c33-8500-b1e1ccc196eb" path="/var/lib/kubelet/pods/5204d8df-af7f-4c33-8500-b1e1ccc196eb/volumes" Apr 17 08:21:29.235839 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:29.235810 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" event={"ID":"c6942f8a-1117-4fe0-a0df-2b7596eda2e9","Type":"ContainerStarted","Data":"dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5"} Apr 17 08:21:29.235839 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:29.235838 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" event={"ID":"c6942f8a-1117-4fe0-a0df-2b7596eda2e9","Type":"ContainerStarted","Data":"c90011115bbf4994688016934a884fd88049fb7035b46b44f019bbfbdbceeec5"} Apr 17 08:21:30.015391 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:30.015347 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:21:31.611266 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:31.611244 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" Apr 17 08:21:31.732598 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:31.732571 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed2f346c-ff4d-4726-91ee-abb2cdd5ef20-kserve-provision-location\") pod \"ed2f346c-ff4d-4726-91ee-abb2cdd5ef20\" (UID: \"ed2f346c-ff4d-4726-91ee-abb2cdd5ef20\") " Apr 17 08:21:31.732861 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:31.732839 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed2f346c-ff4d-4726-91ee-abb2cdd5ef20-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" (UID: "ed2f346c-ff4d-4726-91ee-abb2cdd5ef20"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:21:31.833415 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:31.833369 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed2f346c-ff4d-4726-91ee-abb2cdd5ef20-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:21:32.248860 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.248822 2573 generic.go:358] "Generic (PLEG): container finished" podID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerID="2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386" exitCode=0 Apr 17 08:21:32.249034 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.248891 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" Apr 17 08:21:32.249034 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.248914 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" event={"ID":"ed2f346c-ff4d-4726-91ee-abb2cdd5ef20","Type":"ContainerDied","Data":"2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386"} Apr 17 08:21:32.249034 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.248956 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x" event={"ID":"ed2f346c-ff4d-4726-91ee-abb2cdd5ef20","Type":"ContainerDied","Data":"ed6f45bc869c73b766eef36411f160279a9b0956ac9700cbd15e89ffee742670"} Apr 17 08:21:32.249034 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.248974 2573 scope.go:117] "RemoveContainer" containerID="2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386" Apr 17 08:21:32.257011 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.256995 2573 scope.go:117] "RemoveContainer" 
containerID="c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6" Apr 17 08:21:32.264023 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.264008 2573 scope.go:117] "RemoveContainer" containerID="2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386" Apr 17 08:21:32.264249 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:21:32.264228 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386\": container with ID starting with 2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386 not found: ID does not exist" containerID="2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386" Apr 17 08:21:32.264319 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.264261 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386"} err="failed to get container status \"2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386\": rpc error: code = NotFound desc = could not find container \"2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386\": container with ID starting with 2ced1f33bc6c99512226ad63933e4573382ea7d6c2e108945c884033c7287386 not found: ID does not exist" Apr 17 08:21:32.264319 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.264284 2573 scope.go:117] "RemoveContainer" containerID="c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6" Apr 17 08:21:32.264566 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:21:32.264548 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6\": container with ID starting with c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6 not found: ID does not exist" 
containerID="c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6" Apr 17 08:21:32.264606 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.264572 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6"} err="failed to get container status \"c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6\": rpc error: code = NotFound desc = could not find container \"c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6\": container with ID starting with c28a70f777f0e48d79ef671ae353e49d2835bdf32131cb504a0ab7e230b5cea6 not found: ID does not exist" Apr 17 08:21:32.268577 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.268558 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x"] Apr 17 08:21:32.271691 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:32.271671 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-28dd60-predictor-6c6fbb8f88-xg52x"] Apr 17 08:21:33.166889 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:33.166841 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" path="/var/lib/kubelet/pods/ed2f346c-ff4d-4726-91ee-abb2cdd5ef20/volumes" Apr 17 08:21:34.258446 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:34.258420 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg_c6942f8a-1117-4fe0-a0df-2b7596eda2e9/storage-initializer/0.log" Apr 17 08:21:34.258856 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:34.258455 2573 generic.go:358] "Generic (PLEG): container finished" podID="c6942f8a-1117-4fe0-a0df-2b7596eda2e9" containerID="dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5" exitCode=1 Apr 17 08:21:34.258856 ip-10-0-138-143 kubenswrapper[2573]: I0417 
08:21:34.258515 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" event={"ID":"c6942f8a-1117-4fe0-a0df-2b7596eda2e9","Type":"ContainerDied","Data":"dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5"} Apr 17 08:21:35.262876 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:35.262845 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg_c6942f8a-1117-4fe0-a0df-2b7596eda2e9/storage-initializer/0.log" Apr 17 08:21:35.263300 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:35.262947 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" event={"ID":"c6942f8a-1117-4fe0-a0df-2b7596eda2e9","Type":"ContainerStarted","Data":"fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9"} Apr 17 08:21:37.969410 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:37.969364 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg"] Apr 17 08:21:37.969836 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:37.969687 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" podUID="c6942f8a-1117-4fe0-a0df-2b7596eda2e9" containerName="storage-initializer" containerID="cri-o://fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9" gracePeriod=30 Apr 17 08:21:38.108545 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.108527 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg_c6942f8a-1117-4fe0-a0df-2b7596eda2e9/storage-initializer/1.log" Apr 17 08:21:38.108884 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.108867 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg_c6942f8a-1117-4fe0-a0df-2b7596eda2e9/storage-initializer/0.log" Apr 17 08:21:38.108979 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.108943 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" Apr 17 08:21:38.145799 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.145770 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp"] Apr 17 08:21:38.146048 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146032 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5204d8df-af7f-4c33-8500-b1e1ccc196eb" containerName="storage-initializer" Apr 17 08:21:38.146113 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146051 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5204d8df-af7f-4c33-8500-b1e1ccc196eb" containerName="storage-initializer" Apr 17 08:21:38.146113 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146068 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6942f8a-1117-4fe0-a0df-2b7596eda2e9" containerName="storage-initializer" Apr 17 08:21:38.146113 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146076 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6942f8a-1117-4fe0-a0df-2b7596eda2e9" containerName="storage-initializer" Apr 17 08:21:38.146113 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146090 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6942f8a-1117-4fe0-a0df-2b7596eda2e9" containerName="storage-initializer" Apr 17 08:21:38.146113 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146099 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6942f8a-1117-4fe0-a0df-2b7596eda2e9" containerName="storage-initializer" Apr 17 08:21:38.146113 
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146112 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="storage-initializer" Apr 17 08:21:38.146449 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146121 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="storage-initializer" Apr 17 08:21:38.146449 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146139 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" Apr 17 08:21:38.146449 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146148 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" Apr 17 08:21:38.146449 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146206 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6942f8a-1117-4fe0-a0df-2b7596eda2e9" containerName="storage-initializer" Apr 17 08:21:38.146449 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146219 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5204d8df-af7f-4c33-8500-b1e1ccc196eb" containerName="storage-initializer" Apr 17 08:21:38.146449 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146233 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed2f346c-ff4d-4726-91ee-abb2cdd5ef20" containerName="kserve-container" Apr 17 08:21:38.146449 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146369 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5204d8df-af7f-4c33-8500-b1e1ccc196eb" containerName="storage-initializer" Apr 17 08:21:38.146449 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146399 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5204d8df-af7f-4c33-8500-b1e1ccc196eb" 
containerName="storage-initializer" Apr 17 08:21:38.146809 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146470 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6942f8a-1117-4fe0-a0df-2b7596eda2e9" containerName="storage-initializer" Apr 17 08:21:38.146809 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.146483 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5204d8df-af7f-4c33-8500-b1e1ccc196eb" containerName="storage-initializer" Apr 17 08:21:38.149375 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.149356 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" Apr 17 08:21:38.151532 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.151517 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9pcqp\"" Apr 17 08:21:38.156903 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.156880 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp"] Apr 17 08:21:38.173361 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.173336 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-cabundle-cert\") pod \"c6942f8a-1117-4fe0-a0df-2b7596eda2e9\" (UID: \"c6942f8a-1117-4fe0-a0df-2b7596eda2e9\") " Apr 17 08:21:38.173478 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.173448 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-kserve-provision-location\") pod \"c6942f8a-1117-4fe0-a0df-2b7596eda2e9\" (UID: \"c6942f8a-1117-4fe0-a0df-2b7596eda2e9\") " Apr 17 08:21:38.173655 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.173636 
2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "c6942f8a-1117-4fe0-a0df-2b7596eda2e9" (UID: "c6942f8a-1117-4fe0-a0df-2b7596eda2e9"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:21:38.173705 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.173667 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c6942f8a-1117-4fe0-a0df-2b7596eda2e9" (UID: "c6942f8a-1117-4fe0-a0df-2b7596eda2e9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:21:38.273277 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.273138 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg_c6942f8a-1117-4fe0-a0df-2b7596eda2e9/storage-initializer/1.log" Apr 17 08:21:38.274330 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.274283 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg_c6942f8a-1117-4fe0-a0df-2b7596eda2e9/storage-initializer/0.log" Apr 17 08:21:38.274452 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.274326 2573 generic.go:358] "Generic (PLEG): container finished" podID="c6942f8a-1117-4fe0-a0df-2b7596eda2e9" containerID="fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9" exitCode=1 Apr 17 08:21:38.274452 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.274351 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-gb9jp\" (UID: \"9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" Apr 17 08:21:38.274452 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.274431 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" Apr 17 08:21:38.274588 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.274461 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-cabundle-cert\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:21:38.274588 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.274485 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6942f8a-1117-4fe0-a0df-2b7596eda2e9-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:21:38.274588 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.274426 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" event={"ID":"c6942f8a-1117-4fe0-a0df-2b7596eda2e9","Type":"ContainerDied","Data":"fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9"} Apr 17 08:21:38.274588 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.274567 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg" event={"ID":"c6942f8a-1117-4fe0-a0df-2b7596eda2e9","Type":"ContainerDied","Data":"c90011115bbf4994688016934a884fd88049fb7035b46b44f019bbfbdbceeec5"} Apr 17 08:21:38.274776 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.274592 2573 scope.go:117] "RemoveContainer" 
containerID="fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9" Apr 17 08:21:38.282857 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.282839 2573 scope.go:117] "RemoveContainer" containerID="dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5" Apr 17 08:21:38.289412 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.289395 2573 scope.go:117] "RemoveContainer" containerID="fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9" Apr 17 08:21:38.289635 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:21:38.289620 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9\": container with ID starting with fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9 not found: ID does not exist" containerID="fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9" Apr 17 08:21:38.289695 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.289642 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9"} err="failed to get container status \"fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9\": rpc error: code = NotFound desc = could not find container \"fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9\": container with ID starting with fcdcfbd95a5882eaf9ca565d7f760322a1f95db8e148c1ef765dfb51e41cb3f9 not found: ID does not exist" Apr 17 08:21:38.289695 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.289657 2573 scope.go:117] "RemoveContainer" containerID="dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5" Apr 17 08:21:38.289872 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:21:38.289856 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5\": container with ID starting with dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5 not found: ID does not exist" containerID="dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5" Apr 17 08:21:38.289912 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.289875 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5"} err="failed to get container status \"dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5\": rpc error: code = NotFound desc = could not find container \"dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5\": container with ID starting with dc3c1a678e199cad7da76e49c53d8a4f1c4a713f459f535fc3b9e832c4ffa8d5 not found: ID does not exist" Apr 17 08:21:38.306225 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.306194 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg"] Apr 17 08:21:38.307643 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.307617 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-eb7a51-predictor-d8dc459b5-6tjdg"] Apr 17 08:21:38.375056 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.375035 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-gb9jp\" (UID: \"9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" Apr 17 08:21:38.375355 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.375338 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-gb9jp\" (UID: \"9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" Apr 17 08:21:38.459798 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.459775 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" Apr 17 08:21:38.575333 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:38.575302 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp"] Apr 17 08:21:38.578589 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:21:38.578560 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a98ba4c_2a5e_4154_9da5_ac2dc3f1e9fb.slice/crio-43ff8effd22c51362383e1287b7fbc2cf68d6257df268f0eff0ec583b2e97061 WatchSource:0}: Error finding container 43ff8effd22c51362383e1287b7fbc2cf68d6257df268f0eff0ec583b2e97061: Status 404 returned error can't find the container with id 43ff8effd22c51362383e1287b7fbc2cf68d6257df268f0eff0ec583b2e97061 Apr 17 08:21:39.163519 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:39.163484 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6942f8a-1117-4fe0-a0df-2b7596eda2e9" path="/var/lib/kubelet/pods/c6942f8a-1117-4fe0-a0df-2b7596eda2e9/volumes" Apr 17 08:21:39.278571 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:39.278539 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" event={"ID":"9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb","Type":"ContainerStarted","Data":"f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51"} Apr 17 08:21:39.278571 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:39.278571 
2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" event={"ID":"9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb","Type":"ContainerStarted","Data":"43ff8effd22c51362383e1287b7fbc2cf68d6257df268f0eff0ec583b2e97061"} Apr 17 08:21:43.289692 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:43.289655 2573 generic.go:358] "Generic (PLEG): container finished" podID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerID="f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51" exitCode=0 Apr 17 08:21:43.290063 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:21:43.289731 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" event={"ID":"9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb","Type":"ContainerDied","Data":"f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51"} Apr 17 08:22:04.354494 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:22:04.354463 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" event={"ID":"9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb","Type":"ContainerStarted","Data":"0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e"} Apr 17 08:22:04.354884 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:22:04.354738 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" Apr 17 08:22:04.355964 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:22:04.355942 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 08:22:04.371321 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:22:04.371267 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" podStartSLOduration=5.847254943 podStartE2EDuration="26.371251087s" podCreationTimestamp="2026-04-17 08:21:38 +0000 UTC" firstStartedPulling="2026-04-17 08:21:43.290831221 +0000 UTC m=+1830.634082661" lastFinishedPulling="2026-04-17 08:22:03.814827363 +0000 UTC m=+1851.158078805" observedRunningTime="2026-04-17 08:22:04.370160295 +0000 UTC m=+1851.713411774" watchObservedRunningTime="2026-04-17 08:22:04.371251087 +0000 UTC m=+1851.714502552" Apr 17 08:22:05.357543 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:22:05.357503 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 08:22:15.357893 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:22:15.357849 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 08:22:25.357807 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:22:25.357768 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 08:22:35.357838 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:22:35.357746 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 08:22:45.357466 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:22:45.357426 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 08:22:55.358027 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:22:55.357983 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 08:23:05.357938 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:05.357896 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 08:23:15.358547 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:15.358515 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" Apr 17 08:23:18.274728 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:18.274684 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp"] Apr 17 08:23:18.275203 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:18.275147 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" 
containerName="kserve-container" containerID="cri-o://0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e" gracePeriod=30 Apr 17 08:23:18.360842 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:18.360812 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f"] Apr 17 08:23:18.363866 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:18.363849 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" Apr 17 08:23:18.371042 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:18.371018 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f"] Apr 17 08:23:18.442421 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:18.442363 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c69e6b66-3dd9-4242-8b6f-f57e27e95555-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f\" (UID: \"c69e6b66-3dd9-4242-8b6f-f57e27e95555\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" Apr 17 08:23:18.543095 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:18.543009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c69e6b66-3dd9-4242-8b6f-f57e27e95555-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f\" (UID: \"c69e6b66-3dd9-4242-8b6f-f57e27e95555\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" Apr 17 08:23:18.543356 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:18.543335 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c69e6b66-3dd9-4242-8b6f-f57e27e95555-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f\" (UID: \"c69e6b66-3dd9-4242-8b6f-f57e27e95555\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" Apr 17 08:23:18.674209 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:18.674168 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" Apr 17 08:23:18.789715 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:18.789686 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f"] Apr 17 08:23:18.791638 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:23:18.791609 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc69e6b66_3dd9_4242_8b6f_f57e27e95555.slice/crio-ffda463a840932101e556a31677d92e7db6cb7a59854be8ddda8ed43274e5891 WatchSource:0}: Error finding container ffda463a840932101e556a31677d92e7db6cb7a59854be8ddda8ed43274e5891: Status 404 returned error can't find the container with id ffda463a840932101e556a31677d92e7db6cb7a59854be8ddda8ed43274e5891 Apr 17 08:23:18.793404 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:18.793340 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:23:19.547081 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:19.547047 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" event={"ID":"c69e6b66-3dd9-4242-8b6f-f57e27e95555","Type":"ContainerStarted","Data":"eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af"} Apr 17 08:23:19.547081 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:19.547083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" event={"ID":"c69e6b66-3dd9-4242-8b6f-f57e27e95555","Type":"ContainerStarted","Data":"ffda463a840932101e556a31677d92e7db6cb7a59854be8ddda8ed43274e5891"} Apr 17 08:23:22.424235 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.424211 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" Apr 17 08:23:22.472985 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.472913 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb-kserve-provision-location\") pod \"9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb\" (UID: \"9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb\") " Apr 17 08:23:22.473249 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.473226 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" (UID: "9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:23:22.557036 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.557003 2573 generic.go:358] "Generic (PLEG): container finished" podID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerID="0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e" exitCode=0 Apr 17 08:23:22.557205 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.557075 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" event={"ID":"9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb","Type":"ContainerDied","Data":"0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e"} Apr 17 08:23:22.557205 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.557086 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" Apr 17 08:23:22.557205 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.557110 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp" event={"ID":"9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb","Type":"ContainerDied","Data":"43ff8effd22c51362383e1287b7fbc2cf68d6257df268f0eff0ec583b2e97061"} Apr 17 08:23:22.557205 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.557132 2573 scope.go:117] "RemoveContainer" containerID="0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e" Apr 17 08:23:22.565252 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.565227 2573 scope.go:117] "RemoveContainer" containerID="f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51" Apr 17 08:23:22.572044 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.572025 2573 scope.go:117] "RemoveContainer" containerID="0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e" Apr 17 08:23:22.572343 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:23:22.572278 2573 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e\": container with ID starting with 0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e not found: ID does not exist" containerID="0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e" Apr 17 08:23:22.572343 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.572305 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e"} err="failed to get container status \"0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e\": rpc error: code = NotFound desc = could not find container \"0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e\": container with ID starting with 0c8f46a1da9bc6a4d5d568c7e489667053b80e103a7ac5302c7d801b26a7a23e not found: ID does not exist" Apr 17 08:23:22.572343 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.572323 2573 scope.go:117] "RemoveContainer" containerID="f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51" Apr 17 08:23:22.572580 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:23:22.572565 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51\": container with ID starting with f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51 not found: ID does not exist" containerID="f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51" Apr 17 08:23:22.572618 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.572583 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51"} err="failed to get container status 
\"f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51\": rpc error: code = NotFound desc = could not find container \"f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51\": container with ID starting with f2d978f93c22044d4b1207b03e3892947b2ac4718b05c4f7db85abc5a9192f51 not found: ID does not exist" Apr 17 08:23:22.573923 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.573903 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:23:22.577131 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.577110 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp"] Apr 17 08:23:22.583160 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:22.583136 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-gb9jp"] Apr 17 08:23:23.164425 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:23.164120 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" path="/var/lib/kubelet/pods/9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb/volumes" Apr 17 08:23:23.561261 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:23.561232 2573 generic.go:358] "Generic (PLEG): container finished" podID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerID="eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af" exitCode=0 Apr 17 08:23:23.561684 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:23.561311 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" event={"ID":"c69e6b66-3dd9-4242-8b6f-f57e27e95555","Type":"ContainerDied","Data":"eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af"} Apr 
17 08:23:24.566083 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:24.566052 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" event={"ID":"c69e6b66-3dd9-4242-8b6f-f57e27e95555","Type":"ContainerStarted","Data":"f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7"} Apr 17 08:23:24.566496 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:24.566353 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" Apr 17 08:23:24.567642 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:24.567611 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 08:23:24.583922 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:24.583883 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" podStartSLOduration=6.583853396 podStartE2EDuration="6.583853396s" podCreationTimestamp="2026-04-17 08:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:23:24.582788139 +0000 UTC m=+1931.926039629" watchObservedRunningTime="2026-04-17 08:23:24.583853396 +0000 UTC m=+1931.927104858" Apr 17 08:23:25.568743 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:25.568708 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 08:23:35.569533 
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:35.569487 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 08:23:45.568741 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:45.568698 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 08:23:55.569632 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:23:55.569590 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 08:24:05.569189 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:05.569098 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 08:24:15.568869 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:15.568827 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 08:24:25.569717 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:25.569676 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 08:24:35.569576 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:35.569547 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" Apr 17 08:24:38.477021 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.476991 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f"] Apr 17 08:24:38.477406 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.477223 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container" containerID="cri-o://f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7" gracePeriod=30 Apr 17 08:24:38.532915 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.532884 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"] Apr 17 08:24:38.533127 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.533115 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerName="kserve-container" Apr 17 08:24:38.533171 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.533129 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerName="kserve-container" Apr 17 08:24:38.533171 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.533147 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" 
containerName="storage-initializer"
Apr 17 08:24:38.533171 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.533152 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerName="storage-initializer"
Apr 17 08:24:38.533262 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.533190 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a98ba4c-2a5e-4154-9da5-ac2dc3f1e9fb" containerName="kserve-container"
Apr 17 08:24:38.535967 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.535952 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"
Apr 17 08:24:38.543178 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.543159 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"]
Apr 17 08:24:38.688371 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.688337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd6078b8-c8e4-4e72-9a04-c4a95054d563-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-bltpx\" (UID: \"dd6078b8-c8e4-4e72-9a04-c4a95054d563\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"
Apr 17 08:24:38.789314 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.789241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd6078b8-c8e4-4e72-9a04-c4a95054d563-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-bltpx\" (UID: \"dd6078b8-c8e4-4e72-9a04-c4a95054d563\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"
Apr 17 08:24:38.789661 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.789639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd6078b8-c8e4-4e72-9a04-c4a95054d563-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-bltpx\" (UID: \"dd6078b8-c8e4-4e72-9a04-c4a95054d563\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"
Apr 17 08:24:38.846427 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.846372 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"
Apr 17 08:24:38.959463 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:38.959433 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"]
Apr 17 08:24:38.962457 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:24:38.962426 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd6078b8_c8e4_4e72_9a04_c4a95054d563.slice/crio-47c77e426275ae9f891dab060ac95d37a2ddd87cc6ba6f7821879aa61cd9c087 WatchSource:0}: Error finding container 47c77e426275ae9f891dab060ac95d37a2ddd87cc6ba6f7821879aa61cd9c087: Status 404 returned error can't find the container with id 47c77e426275ae9f891dab060ac95d37a2ddd87cc6ba6f7821879aa61cd9c087
Apr 17 08:24:39.761875 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:39.761838 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" event={"ID":"dd6078b8-c8e4-4e72-9a04-c4a95054d563","Type":"ContainerStarted","Data":"fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a"}
Apr 17 08:24:39.761875 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:39.761875 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" event={"ID":"dd6078b8-c8e4-4e72-9a04-c4a95054d563","Type":"ContainerStarted","Data":"47c77e426275ae9f891dab060ac95d37a2ddd87cc6ba6f7821879aa61cd9c087"}
Apr 17 08:24:42.746022 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.746001 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f"
Apr 17 08:24:42.771102 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.771048 2573 generic.go:358] "Generic (PLEG): container finished" podID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerID="fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a" exitCode=0
Apr 17 08:24:42.771199 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.771121 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" event={"ID":"dd6078b8-c8e4-4e72-9a04-c4a95054d563","Type":"ContainerDied","Data":"fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a"}
Apr 17 08:24:42.772650 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.772631 2573 generic.go:358] "Generic (PLEG): container finished" podID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerID="f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7" exitCode=0
Apr 17 08:24:42.772728 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.772673 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" event={"ID":"c69e6b66-3dd9-4242-8b6f-f57e27e95555","Type":"ContainerDied","Data":"f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7"}
Apr 17 08:24:42.772728 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.772700 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f" event={"ID":"c69e6b66-3dd9-4242-8b6f-f57e27e95555","Type":"ContainerDied","Data":"ffda463a840932101e556a31677d92e7db6cb7a59854be8ddda8ed43274e5891"}
Apr 17 08:24:42.772728 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.772704 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f"
Apr 17 08:24:42.772728 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.772717 2573 scope.go:117] "RemoveContainer" containerID="f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7"
Apr 17 08:24:42.780311 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.780295 2573 scope.go:117] "RemoveContainer" containerID="eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af"
Apr 17 08:24:42.790087 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.790054 2573 scope.go:117] "RemoveContainer" containerID="f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7"
Apr 17 08:24:42.790504 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:24:42.790484 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7\": container with ID starting with f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7 not found: ID does not exist" containerID="f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7"
Apr 17 08:24:42.790599 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.790509 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7"} err="failed to get container status \"f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7\": rpc error: code = NotFound desc = could not find container \"f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7\": container with ID starting with f2d2fe5c8fcfdcddf9cb3bfb7ee3861dc56dcbb877ec6a5afb102480a08464e7 not found: ID does not exist"
Apr 17 08:24:42.790599 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.790527 2573 scope.go:117] "RemoveContainer" containerID="eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af"
Apr 17 08:24:42.790794 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:24:42.790773 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af\": container with ID starting with eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af not found: ID does not exist" containerID="eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af"
Apr 17 08:24:42.790855 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.790799 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af"} err="failed to get container status \"eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af\": rpc error: code = NotFound desc = could not find container \"eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af\": container with ID starting with eb04469617b891d86d92c55e9261384a0a475390371c0cfcf0e8098cb4aca6af not found: ID does not exist"
Apr 17 08:24:42.914245 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.914224 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c69e6b66-3dd9-4242-8b6f-f57e27e95555-kserve-provision-location\") pod \"c69e6b66-3dd9-4242-8b6f-f57e27e95555\" (UID: \"c69e6b66-3dd9-4242-8b6f-f57e27e95555\") "
Apr 17 08:24:42.914537 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:42.914517 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69e6b66-3dd9-4242-8b6f-f57e27e95555-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c69e6b66-3dd9-4242-8b6f-f57e27e95555" (UID: "c69e6b66-3dd9-4242-8b6f-f57e27e95555"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:24:43.014694 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:43.014670 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c69e6b66-3dd9-4242-8b6f-f57e27e95555-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:24:43.092093 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:43.092066 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f"]
Apr 17 08:24:43.094331 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:43.094310 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-ggp5f"]
Apr 17 08:24:43.162933 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:43.162905 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" path="/var/lib/kubelet/pods/c69e6b66-3dd9-4242-8b6f-f57e27e95555/volumes"
Apr 17 08:24:43.776904 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:43.776869 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" event={"ID":"dd6078b8-c8e4-4e72-9a04-c4a95054d563","Type":"ContainerStarted","Data":"f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c"}
Apr 17 08:24:43.777364 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:43.777247 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"
Apr 17 08:24:43.778444 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:43.778417 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 17 08:24:43.791914 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:43.791873 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" podStartSLOduration=5.791860307 podStartE2EDuration="5.791860307s" podCreationTimestamp="2026-04-17 08:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:24:43.791027964 +0000 UTC m=+2011.134279426" watchObservedRunningTime="2026-04-17 08:24:43.791860307 +0000 UTC m=+2011.135111770"
Apr 17 08:24:44.780662 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:44.780623 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 17 08:24:54.781439 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:24:54.781366 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 17 08:25:04.781468 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:04.781425 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 17 08:25:14.781284 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:14.781237 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 17 08:25:24.781104 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:24.781049 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 17 08:25:34.780962 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:34.780870 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 17 08:25:44.781294 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:44.781247 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 17 08:25:54.782596 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:54.782564 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"
Apr 17 08:25:58.749584 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:58.749546 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"]
Apr 17 08:25:58.749980 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:58.749770 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container" containerID="cri-o://f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c" gracePeriod=30
Apr 17 08:25:58.793421 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:58.793396 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"]
Apr 17 08:25:58.793630 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:58.793619 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container"
Apr 17 08:25:58.793677 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:58.793632 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container"
Apr 17 08:25:58.793677 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:58.793651 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="storage-initializer"
Apr 17 08:25:58.793677 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:58.793658 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="storage-initializer"
Apr 17 08:25:58.793767 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:58.793697 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c69e6b66-3dd9-4242-8b6f-f57e27e95555" containerName="kserve-container"
Apr 17 08:25:58.796347 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:58.796329 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"
Apr 17 08:25:58.805597 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:58.805575 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"]
Apr 17 08:25:58.913157 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:58.913128 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c543621-481a-4793-81f8-cce795ebecc9-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9\" (UID: \"4c543621-481a-4793-81f8-cce795ebecc9\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"
Apr 17 08:25:59.013682 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:59.013612 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c543621-481a-4793-81f8-cce795ebecc9-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9\" (UID: \"4c543621-481a-4793-81f8-cce795ebecc9\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"
Apr 17 08:25:59.013922 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:59.013904 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c543621-481a-4793-81f8-cce795ebecc9-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9\" (UID: \"4c543621-481a-4793-81f8-cce795ebecc9\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"
Apr 17 08:25:59.106905 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:59.106886 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"
Apr 17 08:25:59.230802 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:59.230768 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"]
Apr 17 08:25:59.234470 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:25:59.234437 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c543621_481a_4793_81f8_cce795ebecc9.slice/crio-cf85d1ef1d7e66213c345fbe4049723b6c987ff4d3da1079d687b01365f4235d WatchSource:0}: Error finding container cf85d1ef1d7e66213c345fbe4049723b6c987ff4d3da1079d687b01365f4235d: Status 404 returned error can't find the container with id cf85d1ef1d7e66213c345fbe4049723b6c987ff4d3da1079d687b01365f4235d
Apr 17 08:25:59.973237 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:59.973201 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" event={"ID":"4c543621-481a-4793-81f8-cce795ebecc9","Type":"ContainerStarted","Data":"465f6bc0c75e31eab934d55f104d5b695c7116eed172e5f51f2e1402034bcb4c"}
Apr 17 08:25:59.973237 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:25:59.973240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" event={"ID":"4c543621-481a-4793-81f8-cce795ebecc9","Type":"ContainerStarted","Data":"cf85d1ef1d7e66213c345fbe4049723b6c987ff4d3da1079d687b01365f4235d"}
Apr 17 08:26:02.983330 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:02.983296 2573 generic.go:358] "Generic (PLEG): container finished" podID="4c543621-481a-4793-81f8-cce795ebecc9" containerID="465f6bc0c75e31eab934d55f104d5b695c7116eed172e5f51f2e1402034bcb4c" exitCode=0
Apr 17 08:26:02.983690 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:02.983355 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" event={"ID":"4c543621-481a-4793-81f8-cce795ebecc9","Type":"ContainerDied","Data":"465f6bc0c75e31eab934d55f104d5b695c7116eed172e5f51f2e1402034bcb4c"}
Apr 17 08:26:03.972070 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:03.972050 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"
Apr 17 08:26:03.987149 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:03.987115 2573 generic.go:358] "Generic (PLEG): container finished" podID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerID="f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c" exitCode=0
Apr 17 08:26:03.987460 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:03.987172 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"
Apr 17 08:26:03.987460 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:03.987190 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" event={"ID":"dd6078b8-c8e4-4e72-9a04-c4a95054d563","Type":"ContainerDied","Data":"f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c"}
Apr 17 08:26:03.987460 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:03.987227 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx" event={"ID":"dd6078b8-c8e4-4e72-9a04-c4a95054d563","Type":"ContainerDied","Data":"47c77e426275ae9f891dab060ac95d37a2ddd87cc6ba6f7821879aa61cd9c087"}
Apr 17 08:26:03.987460 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:03.987251 2573 scope.go:117] "RemoveContainer" containerID="f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c"
Apr 17 08:26:03.988712 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:03.988683 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" event={"ID":"4c543621-481a-4793-81f8-cce795ebecc9","Type":"ContainerStarted","Data":"9376d4df710dcf48078e224ed794e6e707f0a6848ae2b93fa48946e6c685a203"}
Apr 17 08:26:03.988922 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:03.988904 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"
Apr 17 08:26:03.995836 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:03.995820 2573 scope.go:117] "RemoveContainer" containerID="fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a"
Apr 17 08:26:04.003692 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:04.003674 2573 scope.go:117] "RemoveContainer" containerID="f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c"
Apr 17 08:26:04.003969 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:26:04.003933 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c\": container with ID starting with f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c not found: ID does not exist" containerID="f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c"
Apr 17 08:26:04.004018 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:04.003983 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c"} err="failed to get container status \"f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c\": rpc error: code = NotFound desc = could not find container \"f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c\": container with ID starting with f6757d7e01d014aa193e0cf0b86b8781297ba56d938973acaa5ae8ae3e05a28c not found: ID does not exist"
Apr 17 08:26:04.004018 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:04.004007 2573 scope.go:117] "RemoveContainer" containerID="fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a"
Apr 17 08:26:04.004288 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:26:04.004270 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a\": container with ID starting with fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a not found: ID does not exist" containerID="fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a"
Apr 17 08:26:04.004346 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:04.004293 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a"} err="failed to get container status \"fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a\": rpc error: code = NotFound desc = could not find container \"fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a\": container with ID starting with fb2b8df87a5bb13172eab0bd41cf98fc9a568e083f0b425932c91d700724d43a not found: ID does not exist"
Apr 17 08:26:04.013857 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:04.013818 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" podStartSLOduration=6.013807374 podStartE2EDuration="6.013807374s" podCreationTimestamp="2026-04-17 08:25:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:26:04.013182041 +0000 UTC m=+2091.356433506" watchObservedRunningTime="2026-04-17 08:26:04.013807374 +0000 UTC m=+2091.357058836"
Apr 17 08:26:04.048495 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:04.048476 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd6078b8-c8e4-4e72-9a04-c4a95054d563-kserve-provision-location\") pod \"dd6078b8-c8e4-4e72-9a04-c4a95054d563\" (UID: \"dd6078b8-c8e4-4e72-9a04-c4a95054d563\") "
Apr 17 08:26:04.048785 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:04.048761 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd6078b8-c8e4-4e72-9a04-c4a95054d563-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dd6078b8-c8e4-4e72-9a04-c4a95054d563" (UID: "dd6078b8-c8e4-4e72-9a04-c4a95054d563"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:26:04.149847 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:04.149792 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd6078b8-c8e4-4e72-9a04-c4a95054d563-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:26:04.310885 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:04.310858 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"]
Apr 17 08:26:04.312627 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:04.312604 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-bltpx"]
Apr 17 08:26:05.163183 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:05.163147 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" path="/var/lib/kubelet/pods/dd6078b8-c8e4-4e72-9a04-c4a95054d563/volumes"
Apr 17 08:26:34.993783 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:34.993740 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" podUID="4c543621-481a-4793-81f8-cce795ebecc9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.37:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 17 08:26:44.992956 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:44.992911 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" podUID="4c543621-481a-4793-81f8-cce795ebecc9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.37:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 17 08:26:54.992050 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:26:54.992000 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" podUID="4c543621-481a-4793-81f8-cce795ebecc9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.37:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 17 08:27:04.992885 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:04.992844 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" podUID="4c543621-481a-4793-81f8-cce795ebecc9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.37:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 17 08:27:14.996683 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:14.996636 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"
Apr 17 08:27:18.936840 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:18.936807 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"]
Apr 17 08:27:18.937330 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:18.937065 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" podUID="4c543621-481a-4793-81f8-cce795ebecc9" containerName="kserve-container" containerID="cri-o://9376d4df710dcf48078e224ed794e6e707f0a6848ae2b93fa48946e6c685a203" gracePeriod=30
Apr 17 08:27:19.004253 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.004221 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94"]
Apr 17 08:27:19.004524 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.004510 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="storage-initializer"
Apr 17 08:27:19.004571 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.004526 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="storage-initializer"
Apr 17 08:27:19.004571 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.004539 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container"
Apr 17 08:27:19.004571 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.004544 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container"
Apr 17 08:27:19.004661 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.004593 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd6078b8-c8e4-4e72-9a04-c4a95054d563" containerName="kserve-container"
Apr 17 08:27:19.006230 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.006216 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94"
Apr 17 08:27:19.019070 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.019047 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94"]
Apr 17 08:27:19.139660 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.139627 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af14c5c-a874-4c81-8e33-06dfcc24d2cc-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94\" (UID: \"8af14c5c-a874-4c81-8e33-06dfcc24d2cc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94"
Apr 17 08:27:19.240571 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.240497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af14c5c-a874-4c81-8e33-06dfcc24d2cc-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94\" (UID: \"8af14c5c-a874-4c81-8e33-06dfcc24d2cc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94"
Apr 17 08:27:19.240892 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.240872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af14c5c-a874-4c81-8e33-06dfcc24d2cc-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94\" (UID: \"8af14c5c-a874-4c81-8e33-06dfcc24d2cc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94"
Apr 17 08:27:19.318080 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.318053 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94"
Apr 17 08:27:19.440270 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:19.440240 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94"]
Apr 17 08:27:19.443442 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:27:19.443411 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8af14c5c_a874_4c81_8e33_06dfcc24d2cc.slice/crio-e02afbf4263d006e81e94b178a39ecde421f4e3e1ba0905a6bd6f331f29f3be3 WatchSource:0}: Error finding container e02afbf4263d006e81e94b178a39ecde421f4e3e1ba0905a6bd6f331f29f3be3: Status 404 returned error can't find the container with id e02afbf4263d006e81e94b178a39ecde421f4e3e1ba0905a6bd6f331f29f3be3
Apr 17 08:27:20.189215 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:20.189178 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" event={"ID":"8af14c5c-a874-4c81-8e33-06dfcc24d2cc","Type":"ContainerStarted","Data":"abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29"}
Apr 17 08:27:20.189215 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:20.189212 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" event={"ID":"8af14c5c-a874-4c81-8e33-06dfcc24d2cc","Type":"ContainerStarted","Data":"e02afbf4263d006e81e94b178a39ecde421f4e3e1ba0905a6bd6f331f29f3be3"}
Apr 17 08:27:23.197937 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:23.197910 2573 generic.go:358] "Generic (PLEG): container finished" podID="4c543621-481a-4793-81f8-cce795ebecc9" containerID="9376d4df710dcf48078e224ed794e6e707f0a6848ae2b93fa48946e6c685a203" exitCode=0
Apr 17 08:27:23.198284 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:23.197983 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" event={"ID":"4c543621-481a-4793-81f8-cce795ebecc9","Type":"ContainerDied","Data":"9376d4df710dcf48078e224ed794e6e707f0a6848ae2b93fa48946e6c685a203"}
Apr 17 08:27:23.199334 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:23.199315 2573 generic.go:358] "Generic (PLEG): container finished" podID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerID="abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29" exitCode=0
Apr 17 08:27:23.199441 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:23.199403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" event={"ID":"8af14c5c-a874-4c81-8e33-06dfcc24d2cc","Type":"ContainerDied","Data":"abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29"}
Apr 17 08:27:23.277104 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:23.277084 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" Apr 17 08:27:23.469628 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:23.469605 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c543621-481a-4793-81f8-cce795ebecc9-kserve-provision-location\") pod \"4c543621-481a-4793-81f8-cce795ebecc9\" (UID: \"4c543621-481a-4793-81f8-cce795ebecc9\") " Apr 17 08:27:23.469940 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:23.469919 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c543621-481a-4793-81f8-cce795ebecc9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4c543621-481a-4793-81f8-cce795ebecc9" (UID: "4c543621-481a-4793-81f8-cce795ebecc9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:27:23.570933 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:23.570912 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c543621-481a-4793-81f8-cce795ebecc9-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:27:24.203476 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:24.203440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" event={"ID":"4c543621-481a-4793-81f8-cce795ebecc9","Type":"ContainerDied","Data":"cf85d1ef1d7e66213c345fbe4049723b6c987ff4d3da1079d687b01365f4235d"} Apr 17 08:27:24.203907 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:24.203490 2573 scope.go:117] "RemoveContainer" containerID="9376d4df710dcf48078e224ed794e6e707f0a6848ae2b93fa48946e6c685a203" Apr 17 08:27:24.203907 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:24.203451 2573 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9" Apr 17 08:27:24.205312 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:24.205285 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" event={"ID":"8af14c5c-a874-4c81-8e33-06dfcc24d2cc","Type":"ContainerStarted","Data":"727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6"} Apr 17 08:27:24.205575 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:24.205559 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" Apr 17 08:27:24.215232 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:24.215212 2573 scope.go:117] "RemoveContainer" containerID="465f6bc0c75e31eab934d55f104d5b695c7116eed172e5f51f2e1402034bcb4c" Apr 17 08:27:24.226792 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:24.226749 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" podStartSLOduration=6.226736528 podStartE2EDuration="6.226736528s" podCreationTimestamp="2026-04-17 08:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:27:24.225278595 +0000 UTC m=+2171.568530058" watchObservedRunningTime="2026-04-17 08:27:24.226736528 +0000 UTC m=+2171.569987991" Apr 17 08:27:24.235682 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:24.235658 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"] Apr 17 08:27:24.237211 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:24.237194 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-fhds9"] Apr 
17 08:27:25.163394 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:25.163332 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c543621-481a-4793-81f8-cce795ebecc9" path="/var/lib/kubelet/pods/4c543621-481a-4793-81f8-cce795ebecc9/volumes" Apr 17 08:27:55.210258 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:27:55.210205 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" podUID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.38:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.38:8080: connect: connection refused" Apr 17 08:28:05.209949 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:05.209903 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" podUID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.38:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.38:8080: connect: connection refused" Apr 17 08:28:15.209591 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:15.209550 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" podUID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.38:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.38:8080: connect: connection refused" Apr 17 08:28:25.210174 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:25.210134 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" podUID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.132.0.38:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.38:8080: connect: connection refused" Apr 17 08:28:35.213577 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:35.213494 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" Apr 17 08:28:39.143448 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.143409 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94"] Apr 17 08:28:39.143794 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.143681 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" podUID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerName="kserve-container" containerID="cri-o://727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6" gracePeriod=30 Apr 17 08:28:39.184761 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.184731 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh"] Apr 17 08:28:39.184983 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.184970 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c543621-481a-4793-81f8-cce795ebecc9" containerName="kserve-container" Apr 17 08:28:39.185036 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.184984 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c543621-481a-4793-81f8-cce795ebecc9" containerName="kserve-container" Apr 17 08:28:39.185036 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.185002 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c543621-481a-4793-81f8-cce795ebecc9" containerName="storage-initializer" Apr 17 08:28:39.185036 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.185007 2573 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4c543621-481a-4793-81f8-cce795ebecc9" containerName="storage-initializer" Apr 17 08:28:39.185124 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.185052 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c543621-481a-4793-81f8-cce795ebecc9" containerName="kserve-container" Apr 17 08:28:39.187874 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.187855 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" Apr 17 08:28:39.194660 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.194632 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh"] Apr 17 08:28:39.264608 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.264587 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85913f24-bc94-489e-b32c-b5f5dbe2f6df-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh\" (UID: \"85913f24-bc94-489e-b32c-b5f5dbe2f6df\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" Apr 17 08:28:39.365641 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.365615 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85913f24-bc94-489e-b32c-b5f5dbe2f6df-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh\" (UID: \"85913f24-bc94-489e-b32c-b5f5dbe2f6df\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" Apr 17 08:28:39.365924 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.365909 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/85913f24-bc94-489e-b32c-b5f5dbe2f6df-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh\" (UID: \"85913f24-bc94-489e-b32c-b5f5dbe2f6df\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" Apr 17 08:28:39.498966 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.498929 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" Apr 17 08:28:39.611894 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.611833 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh"] Apr 17 08:28:39.615448 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:28:39.615417 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85913f24_bc94_489e_b32c_b5f5dbe2f6df.slice/crio-cc1246180c3361b61686b125023af6b87d1525a66d451d3e3496e8be7418679e WatchSource:0}: Error finding container cc1246180c3361b61686b125023af6b87d1525a66d451d3e3496e8be7418679e: Status 404 returned error can't find the container with id cc1246180c3361b61686b125023af6b87d1525a66d451d3e3496e8be7418679e Apr 17 08:28:39.617265 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:39.617242 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:28:40.404150 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:40.404110 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" event={"ID":"85913f24-bc94-489e-b32c-b5f5dbe2f6df","Type":"ContainerStarted","Data":"d8c879e9a7515f793736a41bf353664de9d3be84cd6983cfcc80a541d654e274"} Apr 17 08:28:40.404150 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:40.404147 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" event={"ID":"85913f24-bc94-489e-b32c-b5f5dbe2f6df","Type":"ContainerStarted","Data":"cc1246180c3361b61686b125023af6b87d1525a66d451d3e3496e8be7418679e"} Apr 17 08:28:43.373035 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.373002 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" Apr 17 08:28:43.413554 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.413521 2573 generic.go:358] "Generic (PLEG): container finished" podID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerID="727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6" exitCode=0 Apr 17 08:28:43.413667 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.413587 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" Apr 17 08:28:43.413667 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.413606 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" event={"ID":"8af14c5c-a874-4c81-8e33-06dfcc24d2cc","Type":"ContainerDied","Data":"727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6"} Apr 17 08:28:43.413667 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.413642 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94" event={"ID":"8af14c5c-a874-4c81-8e33-06dfcc24d2cc","Type":"ContainerDied","Data":"e02afbf4263d006e81e94b178a39ecde421f4e3e1ba0905a6bd6f331f29f3be3"} Apr 17 08:28:43.413667 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.413663 2573 scope.go:117] "RemoveContainer" containerID="727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6" Apr 17 08:28:43.414893 ip-10-0-138-143 kubenswrapper[2573]: I0417 
08:28:43.414871 2573 generic.go:358] "Generic (PLEG): container finished" podID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerID="d8c879e9a7515f793736a41bf353664de9d3be84cd6983cfcc80a541d654e274" exitCode=0 Apr 17 08:28:43.415006 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.414910 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" event={"ID":"85913f24-bc94-489e-b32c-b5f5dbe2f6df","Type":"ContainerDied","Data":"d8c879e9a7515f793736a41bf353664de9d3be84cd6983cfcc80a541d654e274"} Apr 17 08:28:43.451559 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.451535 2573 scope.go:117] "RemoveContainer" containerID="abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29" Apr 17 08:28:43.472125 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.472092 2573 scope.go:117] "RemoveContainer" containerID="727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6" Apr 17 08:28:43.472539 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:28:43.472519 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6\": container with ID starting with 727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6 not found: ID does not exist" containerID="727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6" Apr 17 08:28:43.472624 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.472549 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6"} err="failed to get container status \"727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6\": rpc error: code = NotFound desc = could not find container \"727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6\": container with ID starting with 
727e61b351b63d4a9f08800d16e22178772894099e2acf31713eb6331a01cea6 not found: ID does not exist" Apr 17 08:28:43.472624 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.472569 2573 scope.go:117] "RemoveContainer" containerID="abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29" Apr 17 08:28:43.472823 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:28:43.472806 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29\": container with ID starting with abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29 not found: ID does not exist" containerID="abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29" Apr 17 08:28:43.472890 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.472827 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29"} err="failed to get container status \"abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29\": rpc error: code = NotFound desc = could not find container \"abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29\": container with ID starting with abca54b045c811bd823af8f60c04b5e775e774d5f298a6c2aaa70c250f6abf29 not found: ID does not exist" Apr 17 08:28:43.493651 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.493633 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af14c5c-a874-4c81-8e33-06dfcc24d2cc-kserve-provision-location\") pod \"8af14c5c-a874-4c81-8e33-06dfcc24d2cc\" (UID: \"8af14c5c-a874-4c81-8e33-06dfcc24d2cc\") " Apr 17 08:28:43.493908 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.493889 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8af14c5c-a874-4c81-8e33-06dfcc24d2cc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8af14c5c-a874-4c81-8e33-06dfcc24d2cc" (UID: "8af14c5c-a874-4c81-8e33-06dfcc24d2cc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:28:43.594014 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.593991 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af14c5c-a874-4c81-8e33-06dfcc24d2cc-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:28:43.732951 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.732908 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94"] Apr 17 08:28:43.733830 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:43.733812 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-sqp94"] Apr 17 08:28:44.419420 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:44.419364 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" event={"ID":"85913f24-bc94-489e-b32c-b5f5dbe2f6df","Type":"ContainerStarted","Data":"814cfdff02e96518c18efeeaa78da652b0aa9c6cb3aabdfb577d3122ed87a855"} Apr 17 08:28:44.419845 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:44.419605 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" Apr 17 08:28:44.433594 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:44.433554 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" podStartSLOduration=5.433541847 
podStartE2EDuration="5.433541847s" podCreationTimestamp="2026-04-17 08:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:28:44.432564201 +0000 UTC m=+2251.775815675" watchObservedRunningTime="2026-04-17 08:28:44.433541847 +0000 UTC m=+2251.776793312" Apr 17 08:28:45.163796 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:28:45.163760 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" path="/var/lib/kubelet/pods/8af14c5c-a874-4c81-8e33-06dfcc24d2cc/volumes" Apr 17 08:29:15.424742 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:29:15.424703 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.39:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 08:29:25.423223 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:29:25.423181 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.39:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 08:29:35.424085 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:29:35.424044 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.39:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.39:8080: connect: connection 
refused" Apr 17 08:29:45.423990 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:29:45.423949 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.39:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 08:29:48.160535 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:29:48.160494 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.39:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 08:29:58.164882 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:29:58.164846 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" Apr 17 08:29:59.304961 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:29:59.304919 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh"] Apr 17 08:29:59.305457 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:29:59.305278 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerName="kserve-container" containerID="cri-o://814cfdff02e96518c18efeeaa78da652b0aa9c6cb3aabdfb577d3122ed87a855" gracePeriod=30 Apr 17 08:30:01.457445 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.457410 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"] Apr 17 08:30:01.457791 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.457665 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerName="storage-initializer" Apr 17 08:30:01.457791 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.457677 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerName="storage-initializer" Apr 17 08:30:01.457791 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.457686 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerName="kserve-container" Apr 17 08:30:01.457791 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.457692 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerName="kserve-container" Apr 17 08:30:01.457791 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.457745 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8af14c5c-a874-4c81-8e33-06dfcc24d2cc" containerName="kserve-container" Apr 17 08:30:01.460542 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.460521 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"
Apr 17 08:30:01.467249 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.467220 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"]
Apr 17 08:30:01.609968 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.609934 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f952bc8c-ed71-44cf-81e1-5f88911bd21a-kserve-provision-location\") pod \"isvc-sklearn-predictor-67bb8f7cbc-v55kt\" (UID: \"f952bc8c-ed71-44cf-81e1-5f88911bd21a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"
Apr 17 08:30:01.710997 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.710910 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f952bc8c-ed71-44cf-81e1-5f88911bd21a-kserve-provision-location\") pod \"isvc-sklearn-predictor-67bb8f7cbc-v55kt\" (UID: \"f952bc8c-ed71-44cf-81e1-5f88911bd21a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"
Apr 17 08:30:01.711258 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.711241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f952bc8c-ed71-44cf-81e1-5f88911bd21a-kserve-provision-location\") pod \"isvc-sklearn-predictor-67bb8f7cbc-v55kt\" (UID: \"f952bc8c-ed71-44cf-81e1-5f88911bd21a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"
Apr 17 08:30:01.771649 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.771619 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"
Apr 17 08:30:01.887839 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:01.887816 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"]
Apr 17 08:30:01.890683 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:30:01.890650 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf952bc8c_ed71_44cf_81e1_5f88911bd21a.slice/crio-d83e7cb1923369d005403580c3ef688cfac85ac50fe43d5d44ef38cd983ac8d6 WatchSource:0}: Error finding container d83e7cb1923369d005403580c3ef688cfac85ac50fe43d5d44ef38cd983ac8d6: Status 404 returned error can't find the container with id d83e7cb1923369d005403580c3ef688cfac85ac50fe43d5d44ef38cd983ac8d6
Apr 17 08:30:02.620814 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:02.620783 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" event={"ID":"f952bc8c-ed71-44cf-81e1-5f88911bd21a","Type":"ContainerStarted","Data":"c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b"}
Apr 17 08:30:02.620814 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:02.620817 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" event={"ID":"f952bc8c-ed71-44cf-81e1-5f88911bd21a","Type":"ContainerStarted","Data":"d83e7cb1923369d005403580c3ef688cfac85ac50fe43d5d44ef38cd983ac8d6"}
Apr 17 08:30:04.627217 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:04.627187 2573 generic.go:358] "Generic (PLEG): container finished" podID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerID="814cfdff02e96518c18efeeaa78da652b0aa9c6cb3aabdfb577d3122ed87a855" exitCode=0
Apr 17 08:30:04.627547 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:04.627257 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" event={"ID":"85913f24-bc94-489e-b32c-b5f5dbe2f6df","Type":"ContainerDied","Data":"814cfdff02e96518c18efeeaa78da652b0aa9c6cb3aabdfb577d3122ed87a855"}
Apr 17 08:30:04.747527 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:04.747502 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh"
Apr 17 08:30:04.833229 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:04.833208 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85913f24-bc94-489e-b32c-b5f5dbe2f6df-kserve-provision-location\") pod \"85913f24-bc94-489e-b32c-b5f5dbe2f6df\" (UID: \"85913f24-bc94-489e-b32c-b5f5dbe2f6df\") "
Apr 17 08:30:04.833541 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:04.833519 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85913f24-bc94-489e-b32c-b5f5dbe2f6df-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "85913f24-bc94-489e-b32c-b5f5dbe2f6df" (UID: "85913f24-bc94-489e-b32c-b5f5dbe2f6df"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:30:04.934280 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:04.934255 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85913f24-bc94-489e-b32c-b5f5dbe2f6df-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:30:05.631710 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:05.631676 2573 generic.go:358] "Generic (PLEG): container finished" podID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerID="c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b" exitCode=0
Apr 17 08:30:05.632101 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:05.631744 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" event={"ID":"f952bc8c-ed71-44cf-81e1-5f88911bd21a","Type":"ContainerDied","Data":"c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b"}
Apr 17 08:30:05.633411 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:05.633368 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh" event={"ID":"85913f24-bc94-489e-b32c-b5f5dbe2f6df","Type":"ContainerDied","Data":"cc1246180c3361b61686b125023af6b87d1525a66d451d3e3496e8be7418679e"}
Apr 17 08:30:05.633505 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:05.633425 2573 scope.go:117] "RemoveContainer" containerID="814cfdff02e96518c18efeeaa78da652b0aa9c6cb3aabdfb577d3122ed87a855"
Apr 17 08:30:05.633505 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:05.633431 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh"
Apr 17 08:30:05.641195 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:05.641175 2573 scope.go:117] "RemoveContainer" containerID="d8c879e9a7515f793736a41bf353664de9d3be84cd6983cfcc80a541d654e274"
Apr 17 08:30:05.657501 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:05.657479 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh"]
Apr 17 08:30:05.658702 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:05.658681 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-778gh"]
Apr 17 08:30:06.637938 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:06.637901 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" event={"ID":"f952bc8c-ed71-44cf-81e1-5f88911bd21a","Type":"ContainerStarted","Data":"8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35"}
Apr 17 08:30:06.638439 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:06.638224 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"
Apr 17 08:30:06.639475 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:06.639446 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 17 08:30:06.653296 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:06.653221 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" podStartSLOduration=5.65320539 podStartE2EDuration="5.65320539s" podCreationTimestamp="2026-04-17 08:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:30:06.652228616 +0000 UTC m=+2333.995480079" watchObservedRunningTime="2026-04-17 08:30:06.65320539 +0000 UTC m=+2333.996456854"
Apr 17 08:30:07.163218 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:07.163180 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" path="/var/lib/kubelet/pods/85913f24-bc94-489e-b32c-b5f5dbe2f6df/volumes"
Apr 17 08:30:07.641352 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:07.641307 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 17 08:30:17.642187 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:17.642142 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 17 08:30:27.642140 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:27.642102 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 17 08:30:37.641718 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:37.641675 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 17 08:30:47.642173 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:47.642129 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 17 08:30:57.641992 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:30:57.641945 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 17 08:31:07.642637 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:07.642592 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"
Apr 17 08:31:11.575560 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.575528 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"]
Apr 17 08:31:11.576002 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.575827 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="kserve-container" containerID="cri-o://8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35" gracePeriod=30
Apr 17 08:31:11.624261 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.624231 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"]
Apr 17 08:31:11.624498 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.624485 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerName="kserve-container"
Apr 17 08:31:11.624548 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.624500 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerName="kserve-container"
Apr 17 08:31:11.624548 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.624515 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerName="storage-initializer"
Apr 17 08:31:11.624548 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.624521 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerName="storage-initializer"
Apr 17 08:31:11.624642 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.624561 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="85913f24-bc94-489e-b32c-b5f5dbe2f6df" containerName="kserve-container"
Apr 17 08:31:11.627313 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.627295 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"
Apr 17 08:31:11.635956 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.635934 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"]
Apr 17 08:31:11.688834 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.688805 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa700c3b-ad09-41e2-afc7-7649d125e4e1-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-rvthc\" (UID: \"fa700c3b-ad09-41e2-afc7-7649d125e4e1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"
Apr 17 08:31:11.789118 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.789094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa700c3b-ad09-41e2-afc7-7649d125e4e1-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-rvthc\" (UID: \"fa700c3b-ad09-41e2-afc7-7649d125e4e1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"
Apr 17 08:31:11.789465 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.789449 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa700c3b-ad09-41e2-afc7-7649d125e4e1-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-rvthc\" (UID: \"fa700c3b-ad09-41e2-afc7-7649d125e4e1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"
Apr 17 08:31:11.937827 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:11.937803 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"
Apr 17 08:31:12.050640 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:12.050611 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"]
Apr 17 08:31:12.053676 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:31:12.053643 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa700c3b_ad09_41e2_afc7_7649d125e4e1.slice/crio-4eb7c8ed06ca15b3f363e50faa53971ae5ea61c820857492ee237d871090189a WatchSource:0}: Error finding container 4eb7c8ed06ca15b3f363e50faa53971ae5ea61c820857492ee237d871090189a: Status 404 returned error can't find the container with id 4eb7c8ed06ca15b3f363e50faa53971ae5ea61c820857492ee237d871090189a
Apr 17 08:31:12.813230 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:12.813193 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc" event={"ID":"fa700c3b-ad09-41e2-afc7-7649d125e4e1","Type":"ContainerStarted","Data":"652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283"}
Apr 17 08:31:12.813230 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:12.813231 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc" event={"ID":"fa700c3b-ad09-41e2-afc7-7649d125e4e1","Type":"ContainerStarted","Data":"4eb7c8ed06ca15b3f363e50faa53971ae5ea61c820857492ee237d871090189a"}
Apr 17 08:31:15.507411 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.507389 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"
Apr 17 08:31:15.618250 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.618188 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f952bc8c-ed71-44cf-81e1-5f88911bd21a-kserve-provision-location\") pod \"f952bc8c-ed71-44cf-81e1-5f88911bd21a\" (UID: \"f952bc8c-ed71-44cf-81e1-5f88911bd21a\") "
Apr 17 08:31:15.618520 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.618499 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f952bc8c-ed71-44cf-81e1-5f88911bd21a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f952bc8c-ed71-44cf-81e1-5f88911bd21a" (UID: "f952bc8c-ed71-44cf-81e1-5f88911bd21a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:31:15.718904 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.718882 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f952bc8c-ed71-44cf-81e1-5f88911bd21a-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:31:15.820945 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.820919 2573 generic.go:358] "Generic (PLEG): container finished" podID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerID="8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35" exitCode=0
Apr 17 08:31:15.821072 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.820982 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"
Apr 17 08:31:15.821072 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.821005 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" event={"ID":"f952bc8c-ed71-44cf-81e1-5f88911bd21a","Type":"ContainerDied","Data":"8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35"}
Apr 17 08:31:15.821072 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.821053 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt" event={"ID":"f952bc8c-ed71-44cf-81e1-5f88911bd21a","Type":"ContainerDied","Data":"d83e7cb1923369d005403580c3ef688cfac85ac50fe43d5d44ef38cd983ac8d6"}
Apr 17 08:31:15.821248 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.821074 2573 scope.go:117] "RemoveContainer" containerID="8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35"
Apr 17 08:31:15.822431 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.822408 2573 generic.go:358] "Generic (PLEG): container finished" podID="fa700c3b-ad09-41e2-afc7-7649d125e4e1" containerID="652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283" exitCode=0
Apr 17 08:31:15.822548 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.822432 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc" event={"ID":"fa700c3b-ad09-41e2-afc7-7649d125e4e1","Type":"ContainerDied","Data":"652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283"}
Apr 17 08:31:15.828668 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.828552 2573 scope.go:117] "RemoveContainer" containerID="c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b"
Apr 17 08:31:15.835559 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.835543 2573 scope.go:117] "RemoveContainer" containerID="8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35"
Apr 17 08:31:15.835862 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:31:15.835844 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35\": container with ID starting with 8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35 not found: ID does not exist" containerID="8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35"
Apr 17 08:31:15.835920 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.835870 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35"} err="failed to get container status \"8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35\": rpc error: code = NotFound desc = could not find container \"8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35\": container with ID starting with 8e3db2a7e0a5fe0d786e89bb6fad5d3920134a3912359cc34dd32c960d086e35 not found: ID does not exist"
Apr 17 08:31:15.835920 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.835890 2573 scope.go:117] "RemoveContainer" containerID="c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b"
Apr 17 08:31:15.836111 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:31:15.836097 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b\": container with ID starting with c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b not found: ID does not exist" containerID="c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b"
Apr 17 08:31:15.836154 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.836114 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b"} err="failed to get container status \"c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b\": rpc error: code = NotFound desc = could not find container \"c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b\": container with ID starting with c47b5a6cab597501bdd1bb4d9b46886aec770e98de603c15d6497aac03f8140b not found: ID does not exist"
Apr 17 08:31:15.855058 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.855038 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"]
Apr 17 08:31:15.857143 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:15.857123 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-v55kt"]
Apr 17 08:31:16.827290 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:16.827253 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc" event={"ID":"fa700c3b-ad09-41e2-afc7-7649d125e4e1","Type":"ContainerStarted","Data":"ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e"}
Apr 17 08:31:16.827670 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:16.827471 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"
Apr 17 08:31:16.843476 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:16.843431 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc" podStartSLOduration=5.843418892 podStartE2EDuration="5.843418892s" podCreationTimestamp="2026-04-17 08:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:31:16.841407067 +0000 UTC m=+2404.184658527" watchObservedRunningTime="2026-04-17 08:31:16.843418892 +0000 UTC m=+2404.186670355"
Apr 17 08:31:17.163170 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:17.163080 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" path="/var/lib/kubelet/pods/f952bc8c-ed71-44cf-81e1-5f88911bd21a/volumes"
Apr 17 08:31:47.902293 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:47.902206 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc" podUID="fa700c3b-ad09-41e2-afc7-7649d125e4e1" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 17 08:31:57.833005 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:31:57.832972 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"
Apr 17 08:32:01.783131 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.783099 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"]
Apr 17 08:32:01.783561 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.783338 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc" podUID="fa700c3b-ad09-41e2-afc7-7649d125e4e1" containerName="kserve-container" containerID="cri-o://ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e" gracePeriod=30
Apr 17 08:32:01.805831 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.805804 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"]
Apr 17 08:32:01.806054 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.806041 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="storage-initializer"
Apr 17 08:32:01.806054 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.806055 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="storage-initializer"
Apr 17 08:32:01.806182 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.806066 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="kserve-container"
Apr 17 08:32:01.806182 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.806071 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="kserve-container"
Apr 17 08:32:01.806182 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.806114 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f952bc8c-ed71-44cf-81e1-5f88911bd21a" containerName="kserve-container"
Apr 17 08:32:01.809098 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.809078 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"
Apr 17 08:32:01.815614 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.815595 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"]
Apr 17 08:32:01.829292 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.829267 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34b62116-f512-4dd2-bdd2-69b56952f38b-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-667495d4b9-j6zt2\" (UID: \"34b62116-f512-4dd2-bdd2-69b56952f38b\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"
Apr 17 08:32:01.930275 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.930242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34b62116-f512-4dd2-bdd2-69b56952f38b-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-667495d4b9-j6zt2\" (UID: \"34b62116-f512-4dd2-bdd2-69b56952f38b\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"
Apr 17 08:32:01.930591 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:01.930574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34b62116-f512-4dd2-bdd2-69b56952f38b-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-667495d4b9-j6zt2\" (UID: \"34b62116-f512-4dd2-bdd2-69b56952f38b\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"
Apr 17 08:32:02.119953 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:02.119842 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"
Apr 17 08:32:02.236577 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:02.236511 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"]
Apr 17 08:32:02.239094 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:32:02.239059 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34b62116_f512_4dd2_bdd2_69b56952f38b.slice/crio-1e908f437e9cc3dc4d52503de81d6966e9cdc8f1401292ac8034435d8c6f549e WatchSource:0}: Error finding container 1e908f437e9cc3dc4d52503de81d6966e9cdc8f1401292ac8034435d8c6f549e: Status 404 returned error can't find the container with id 1e908f437e9cc3dc4d52503de81d6966e9cdc8f1401292ac8034435d8c6f549e
Apr 17 08:32:02.948452 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:02.948420 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" event={"ID":"34b62116-f512-4dd2-bdd2-69b56952f38b","Type":"ContainerStarted","Data":"58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220"}
Apr 17 08:32:02.948452 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:02.948454 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" event={"ID":"34b62116-f512-4dd2-bdd2-69b56952f38b","Type":"ContainerStarted","Data":"1e908f437e9cc3dc4d52503de81d6966e9cdc8f1401292ac8034435d8c6f549e"}
Apr 17 08:32:07.831167 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:07.831121 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc" podUID="fa700c3b-ad09-41e2-afc7-7649d125e4e1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.41:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 17 08:32:07.963776 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:07.963742 2573 generic.go:358] "Generic (PLEG): container finished" podID="34b62116-f512-4dd2-bdd2-69b56952f38b" containerID="58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220" exitCode=0
Apr 17 08:32:07.963930 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:07.963808 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" event={"ID":"34b62116-f512-4dd2-bdd2-69b56952f38b","Type":"ContainerDied","Data":"58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220"}
Apr 17 08:32:08.524586 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.524560 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"
Apr 17 08:32:08.569040 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.568977 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa700c3b-ad09-41e2-afc7-7649d125e4e1-kserve-provision-location\") pod \"fa700c3b-ad09-41e2-afc7-7649d125e4e1\" (UID: \"fa700c3b-ad09-41e2-afc7-7649d125e4e1\") "
Apr 17 08:32:08.569311 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.569283 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa700c3b-ad09-41e2-afc7-7649d125e4e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fa700c3b-ad09-41e2-afc7-7649d125e4e1" (UID: "fa700c3b-ad09-41e2-afc7-7649d125e4e1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:32:08.669339 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.669311 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa700c3b-ad09-41e2-afc7-7649d125e4e1-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:32:08.968449 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.968413 2573 generic.go:358] "Generic (PLEG): container finished" podID="fa700c3b-ad09-41e2-afc7-7649d125e4e1" containerID="ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e" exitCode=0
Apr 17 08:32:08.968914 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.968494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc" event={"ID":"fa700c3b-ad09-41e2-afc7-7649d125e4e1","Type":"ContainerDied","Data":"ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e"}
Apr 17 08:32:08.968914 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.968524 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc" event={"ID":"fa700c3b-ad09-41e2-afc7-7649d125e4e1","Type":"ContainerDied","Data":"4eb7c8ed06ca15b3f363e50faa53971ae5ea61c820857492ee237d871090189a"}
Apr 17 08:32:08.968914 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.968546 2573 scope.go:117] "RemoveContainer" containerID="ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e"
Apr 17 08:32:08.968914 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.968547 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"
Apr 17 08:32:08.970330 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.970301 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" event={"ID":"34b62116-f512-4dd2-bdd2-69b56952f38b","Type":"ContainerStarted","Data":"e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9"}
Apr 17 08:32:08.970608 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.970580 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"
Apr 17 08:32:08.971826 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.971798 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" podUID="34b62116-f512-4dd2-bdd2-69b56952f38b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused"
Apr 17 08:32:08.978025 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.977886 2573 scope.go:117] "RemoveContainer" containerID="652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283"
Apr 17 08:32:08.986696 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.986680 2573 scope.go:117] "RemoveContainer" containerID="ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e"
Apr 17 08:32:08.986942 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:32:08.986925 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e\": container with ID starting with ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e not found: ID does not exist" containerID="ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e"
Apr 17 08:32:08.986997 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.986949 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e"} err="failed to get container status \"ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e\": rpc error: code = NotFound desc = could not find container \"ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e\": container with ID starting with ae4ed3aef0728379293629f36081d8f956354165684bbd210c6b5f7f15b37a8e not found: ID does not exist"
Apr 17 08:32:08.986997 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.986964 2573 scope.go:117] "RemoveContainer" containerID="652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283"
Apr 17 08:32:08.987170 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:32:08.987155 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283\": container with ID starting with 652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283 not found: ID does not exist" containerID="652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283"
Apr 17 08:32:08.987214 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.987174 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283"} err="failed to get container status \"652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283\": rpc error: code = NotFound desc = could not find container \"652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283\": container with ID starting with 652bcca9417b45fa8d62d1e535148c0d1dcf00f4a6f7be640ee013778609f283 not found: ID does not exist"
Apr 17 08:32:08.991827 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:08.991681 2573 pod_startup_latency_tracker.go:104] "Observed
pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" podStartSLOduration=7.991669992 podStartE2EDuration="7.991669992s" podCreationTimestamp="2026-04-17 08:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:32:08.991335365 +0000 UTC m=+2456.334586829" watchObservedRunningTime="2026-04-17 08:32:08.991669992 +0000 UTC m=+2456.334921456" Apr 17 08:32:09.004924 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:09.004903 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"] Apr 17 08:32:09.008209 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:09.008191 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-rvthc"] Apr 17 08:32:09.166627 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:09.166598 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa700c3b-ad09-41e2-afc7-7649d125e4e1" path="/var/lib/kubelet/pods/fa700c3b-ad09-41e2-afc7-7649d125e4e1/volumes" Apr 17 08:32:09.974949 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:09.974895 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" podUID="34b62116-f512-4dd2-bdd2-69b56952f38b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 17 08:32:19.975698 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:19.975645 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" podUID="34b62116-f512-4dd2-bdd2-69b56952f38b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 17 08:32:29.975672 ip-10-0-138-143 kubenswrapper[2573]: I0417 
08:32:29.975642 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" Apr 17 08:32:38.761585 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:38.761541 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-667495d4b9-j6zt2_34b62116-f512-4dd2-bdd2-69b56952f38b/kserve-container/0.log" Apr 17 08:32:38.936206 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:38.936175 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"] Apr 17 08:32:38.936515 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:38.936464 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" podUID="34b62116-f512-4dd2-bdd2-69b56952f38b" containerName="kserve-container" containerID="cri-o://e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9" gracePeriod=30 Apr 17 08:32:38.959574 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:38.959548 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f"] Apr 17 08:32:38.959792 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:38.959781 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa700c3b-ad09-41e2-afc7-7649d125e4e1" containerName="storage-initializer" Apr 17 08:32:38.959839 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:38.959794 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa700c3b-ad09-41e2-afc7-7649d125e4e1" containerName="storage-initializer" Apr 17 08:32:38.959839 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:38.959806 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa700c3b-ad09-41e2-afc7-7649d125e4e1" containerName="kserve-container" Apr 17 08:32:38.959839 
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:38.959812 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa700c3b-ad09-41e2-afc7-7649d125e4e1" containerName="kserve-container" Apr 17 08:32:38.959934 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:38.959855 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa700c3b-ad09-41e2-afc7-7649d125e4e1" containerName="kserve-container" Apr 17 08:32:38.962646 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:38.962631 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" Apr 17 08:32:38.969730 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:38.969688 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f"] Apr 17 08:32:39.079029 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:39.078963 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f28b4e4-26d1-497c-8a54-6788a80a1b6b-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f\" (UID: \"5f28b4e4-26d1-497c-8a54-6788a80a1b6b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" Apr 17 08:32:39.179926 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:39.179899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f28b4e4-26d1-497c-8a54-6788a80a1b6b-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f\" (UID: \"5f28b4e4-26d1-497c-8a54-6788a80a1b6b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" Apr 17 08:32:39.180212 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:39.180195 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f28b4e4-26d1-497c-8a54-6788a80a1b6b-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f\" (UID: \"5f28b4e4-26d1-497c-8a54-6788a80a1b6b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" Apr 17 08:32:39.274226 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:39.274191 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" Apr 17 08:32:39.400351 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:39.400186 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f"] Apr 17 08:32:39.404460 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:32:39.404430 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f28b4e4_26d1_497c_8a54_6788a80a1b6b.slice/crio-97aec5607cb38fdd0b587fbb210bc5ea9590b5bcfbfd2fdfaaa5c5dc719fa89b WatchSource:0}: Error finding container 97aec5607cb38fdd0b587fbb210bc5ea9590b5bcfbfd2fdfaaa5c5dc719fa89b: Status 404 returned error can't find the container with id 97aec5607cb38fdd0b587fbb210bc5ea9590b5bcfbfd2fdfaaa5c5dc719fa89b Apr 17 08:32:39.777135 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:39.777113 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" Apr 17 08:32:39.887716 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:39.887642 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34b62116-f512-4dd2-bdd2-69b56952f38b-kserve-provision-location\") pod \"34b62116-f512-4dd2-bdd2-69b56952f38b\" (UID: \"34b62116-f512-4dd2-bdd2-69b56952f38b\") " Apr 17 08:32:39.914482 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:39.914436 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34b62116-f512-4dd2-bdd2-69b56952f38b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34b62116-f512-4dd2-bdd2-69b56952f38b" (UID: "34b62116-f512-4dd2-bdd2-69b56952f38b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:32:39.988862 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:39.988830 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34b62116-f512-4dd2-bdd2-69b56952f38b-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:32:40.054287 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.054258 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" event={"ID":"5f28b4e4-26d1-497c-8a54-6788a80a1b6b","Type":"ContainerStarted","Data":"d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956"} Apr 17 08:32:40.054451 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.054298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" 
event={"ID":"5f28b4e4-26d1-497c-8a54-6788a80a1b6b","Type":"ContainerStarted","Data":"97aec5607cb38fdd0b587fbb210bc5ea9590b5bcfbfd2fdfaaa5c5dc719fa89b"} Apr 17 08:32:40.055516 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.055493 2573 generic.go:358] "Generic (PLEG): container finished" podID="34b62116-f512-4dd2-bdd2-69b56952f38b" containerID="e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9" exitCode=0 Apr 17 08:32:40.055606 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.055525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" event={"ID":"34b62116-f512-4dd2-bdd2-69b56952f38b","Type":"ContainerDied","Data":"e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9"} Apr 17 08:32:40.055606 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.055542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" event={"ID":"34b62116-f512-4dd2-bdd2-69b56952f38b","Type":"ContainerDied","Data":"1e908f437e9cc3dc4d52503de81d6966e9cdc8f1401292ac8034435d8c6f549e"} Apr 17 08:32:40.055606 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.055545 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2" Apr 17 08:32:40.055606 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.055557 2573 scope.go:117] "RemoveContainer" containerID="e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9" Apr 17 08:32:40.062947 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.062899 2573 scope.go:117] "RemoveContainer" containerID="58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220" Apr 17 08:32:40.069961 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.069865 2573 scope.go:117] "RemoveContainer" containerID="e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9" Apr 17 08:32:40.070276 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:32:40.070227 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9\": container with ID starting with e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9 not found: ID does not exist" containerID="e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9" Apr 17 08:32:40.070276 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.070253 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9"} err="failed to get container status \"e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9\": rpc error: code = NotFound desc = could not find container \"e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9\": container with ID starting with e57cdcac6a25f2a6e68b71d92796eeca5581507213a3214aaedbcfd79ace40f9 not found: ID does not exist" Apr 17 08:32:40.070276 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.070271 2573 scope.go:117] "RemoveContainer" containerID="58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220" Apr 17 
08:32:40.070586 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:32:40.070570 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220\": container with ID starting with 58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220 not found: ID does not exist" containerID="58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220" Apr 17 08:32:40.070631 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.070598 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220"} err="failed to get container status \"58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220\": rpc error: code = NotFound desc = could not find container \"58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220\": container with ID starting with 58b94e620eb8c0048cdb26d6ea413e042b4e4302933c8e937ffd62d5e27c2220 not found: ID does not exist" Apr 17 08:32:40.082633 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.082608 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"] Apr 17 08:32:40.086429 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:40.086407 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-j6zt2"] Apr 17 08:32:41.163532 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:41.163502 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b62116-f512-4dd2-bdd2-69b56952f38b" path="/var/lib/kubelet/pods/34b62116-f512-4dd2-bdd2-69b56952f38b/volumes" Apr 17 08:32:44.069912 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:44.069868 2573 generic.go:358] "Generic (PLEG): container finished" podID="5f28b4e4-26d1-497c-8a54-6788a80a1b6b" 
containerID="d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956" exitCode=0 Apr 17 08:32:44.070340 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:44.069950 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" event={"ID":"5f28b4e4-26d1-497c-8a54-6788a80a1b6b","Type":"ContainerDied","Data":"d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956"} Apr 17 08:32:45.074252 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:45.074216 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" event={"ID":"5f28b4e4-26d1-497c-8a54-6788a80a1b6b","Type":"ContainerStarted","Data":"9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73"} Apr 17 08:32:45.074693 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:45.074450 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" Apr 17 08:32:45.091549 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:32:45.091506 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" podStartSLOduration=7.09149358 podStartE2EDuration="7.09149358s" podCreationTimestamp="2026-04-17 08:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:32:45.089686855 +0000 UTC m=+2492.432938318" watchObservedRunningTime="2026-04-17 08:32:45.09149358 +0000 UTC m=+2492.434745042" Apr 17 08:33:16.102136 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:16.102090 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" podUID="5f28b4e4-26d1-497c-8a54-6788a80a1b6b" containerName="kserve-container" probeResult="failure" output="HTTP 
probe failed with statuscode: 400" Apr 17 08:33:26.079508 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:26.079479 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" Apr 17 08:33:29.046405 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.046348 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f"] Apr 17 08:33:29.046882 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.046681 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" podUID="5f28b4e4-26d1-497c-8a54-6788a80a1b6b" containerName="kserve-container" containerID="cri-o://9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73" gracePeriod=30 Apr 17 08:33:29.095476 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.095447 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26"] Apr 17 08:33:29.095714 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.095702 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34b62116-f512-4dd2-bdd2-69b56952f38b" containerName="storage-initializer" Apr 17 08:33:29.095762 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.095716 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b62116-f512-4dd2-bdd2-69b56952f38b" containerName="storage-initializer" Apr 17 08:33:29.095762 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.095726 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34b62116-f512-4dd2-bdd2-69b56952f38b" containerName="kserve-container" Apr 17 08:33:29.095762 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.095733 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b62116-f512-4dd2-bdd2-69b56952f38b" 
containerName="kserve-container" Apr 17 08:33:29.095858 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.095785 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="34b62116-f512-4dd2-bdd2-69b56952f38b" containerName="kserve-container" Apr 17 08:33:29.098603 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.098585 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" Apr 17 08:33:29.106931 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.106909 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26"] Apr 17 08:33:29.234423 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.234367 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/289f51f8-d645-4ecb-8009-ec11650efe99-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7c6977494c-phs26\" (UID: \"289f51f8-d645-4ecb-8009-ec11650efe99\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" Apr 17 08:33:29.334780 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.334703 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/289f51f8-d645-4ecb-8009-ec11650efe99-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7c6977494c-phs26\" (UID: \"289f51f8-d645-4ecb-8009-ec11650efe99\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" Apr 17 08:33:29.335062 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.335042 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/289f51f8-d645-4ecb-8009-ec11650efe99-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7c6977494c-phs26\" (UID: 
\"289f51f8-d645-4ecb-8009-ec11650efe99\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" Apr 17 08:33:29.409482 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.409455 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" Apr 17 08:33:29.523708 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:29.523565 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26"] Apr 17 08:33:29.526427 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:33:29.526398 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod289f51f8_d645_4ecb_8009_ec11650efe99.slice/crio-a6f5cd6c54016692a3ae4dc2e191768d4de42d04437a4ac0c1a6789d217fc7c3 WatchSource:0}: Error finding container a6f5cd6c54016692a3ae4dc2e191768d4de42d04437a4ac0c1a6789d217fc7c3: Status 404 returned error can't find the container with id a6f5cd6c54016692a3ae4dc2e191768d4de42d04437a4ac0c1a6789d217fc7c3 Apr 17 08:33:30.189781 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:30.189748 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" event={"ID":"289f51f8-d645-4ecb-8009-ec11650efe99","Type":"ContainerStarted","Data":"2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5"} Apr 17 08:33:30.189781 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:30.189782 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" event={"ID":"289f51f8-d645-4ecb-8009-ec11650efe99","Type":"ContainerStarted","Data":"a6f5cd6c54016692a3ae4dc2e191768d4de42d04437a4ac0c1a6789d217fc7c3"} Apr 17 08:33:33.199112 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:33.199085 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="289f51f8-d645-4ecb-8009-ec11650efe99" containerID="2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5" exitCode=0 Apr 17 08:33:33.199455 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:33.199136 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" event={"ID":"289f51f8-d645-4ecb-8009-ec11650efe99","Type":"ContainerDied","Data":"2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5"} Apr 17 08:33:34.204000 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:34.203962 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" event={"ID":"289f51f8-d645-4ecb-8009-ec11650efe99","Type":"ContainerStarted","Data":"bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0"} Apr 17 08:33:34.204473 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:34.204280 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" Apr 17 08:33:34.205854 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:34.205828 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 17 08:33:34.219006 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:34.218967 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" podStartSLOduration=5.218953757 podStartE2EDuration="5.218953757s" podCreationTimestamp="2026-04-17 08:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:33:34.217792927 +0000 UTC m=+2541.561044388" 
watchObservedRunningTime="2026-04-17 08:33:34.218953757 +0000 UTC m=+2541.562205220" Apr 17 08:33:35.207290 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:35.207254 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 17 08:33:35.584430 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:35.584410 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" Apr 17 08:33:35.684061 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:35.684029 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f28b4e4-26d1-497c-8a54-6788a80a1b6b-kserve-provision-location\") pod \"5f28b4e4-26d1-497c-8a54-6788a80a1b6b\" (UID: \"5f28b4e4-26d1-497c-8a54-6788a80a1b6b\") " Apr 17 08:33:35.684332 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:35.684311 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f28b4e4-26d1-497c-8a54-6788a80a1b6b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5f28b4e4-26d1-497c-8a54-6788a80a1b6b" (UID: "5f28b4e4-26d1-497c-8a54-6788a80a1b6b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:33:35.785104 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:35.785040 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f28b4e4-26d1-497c-8a54-6788a80a1b6b-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:33:36.211172 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.211140 2573 generic.go:358] "Generic (PLEG): container finished" podID="5f28b4e4-26d1-497c-8a54-6788a80a1b6b" containerID="9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73" exitCode=0 Apr 17 08:33:36.211596 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.211225 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" Apr 17 08:33:36.211596 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.211222 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" event={"ID":"5f28b4e4-26d1-497c-8a54-6788a80a1b6b","Type":"ContainerDied","Data":"9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73"} Apr 17 08:33:36.211596 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.211332 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f" event={"ID":"5f28b4e4-26d1-497c-8a54-6788a80a1b6b","Type":"ContainerDied","Data":"97aec5607cb38fdd0b587fbb210bc5ea9590b5bcfbfd2fdfaaa5c5dc719fa89b"} Apr 17 08:33:36.211596 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.211347 2573 scope.go:117] "RemoveContainer" containerID="9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73" Apr 17 08:33:36.219270 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.219255 2573 scope.go:117] "RemoveContainer" 
containerID="d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956" Apr 17 08:33:36.225732 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.225716 2573 scope.go:117] "RemoveContainer" containerID="9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73" Apr 17 08:33:36.225968 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:33:36.225949 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73\": container with ID starting with 9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73 not found: ID does not exist" containerID="9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73" Apr 17 08:33:36.226027 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.225977 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73"} err="failed to get container status \"9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73\": rpc error: code = NotFound desc = could not find container \"9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73\": container with ID starting with 9a6a331b347f816155f0beac571f691effe0ac8294794f16236fcac5e1105a73 not found: ID does not exist" Apr 17 08:33:36.226027 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.225996 2573 scope.go:117] "RemoveContainer" containerID="d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956" Apr 17 08:33:36.226236 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:33:36.226221 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956\": container with ID starting with d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956 not found: ID does not exist" 
containerID="d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956" Apr 17 08:33:36.226282 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.226240 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956"} err="failed to get container status \"d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956\": rpc error: code = NotFound desc = could not find container \"d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956\": container with ID starting with d6ba4c985a08fc374da78bce0f94c71e1d6ab1f5cabe811518cfc1b3aed8d956 not found: ID does not exist" Apr 17 08:33:36.230997 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.230978 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f"] Apr 17 08:33:36.235770 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:36.235751 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-lwg7f"] Apr 17 08:33:37.163339 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:37.163304 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f28b4e4-26d1-497c-8a54-6788a80a1b6b" path="/var/lib/kubelet/pods/5f28b4e4-26d1-497c-8a54-6788a80a1b6b/volumes" Apr 17 08:33:45.207222 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:45.207177 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 17 08:33:55.208164 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:33:55.208121 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" 
podUID="289f51f8-d645-4ecb-8009-ec11650efe99" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 17 08:34:05.207181 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:05.207144 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 17 08:34:15.207375 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:15.207337 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 17 08:34:25.207992 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:25.207950 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 17 08:34:35.208084 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:35.208010 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" Apr 17 08:34:39.293177 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.293141 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26"] Apr 17 08:34:39.293597 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.293419 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" 
containerName="kserve-container" containerID="cri-o://bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0" gracePeriod=30 Apr 17 08:34:39.366257 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.366226 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6"] Apr 17 08:34:39.366510 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.366498 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f28b4e4-26d1-497c-8a54-6788a80a1b6b" containerName="kserve-container" Apr 17 08:34:39.366566 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.366513 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f28b4e4-26d1-497c-8a54-6788a80a1b6b" containerName="kserve-container" Apr 17 08:34:39.366566 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.366521 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f28b4e4-26d1-497c-8a54-6788a80a1b6b" containerName="storage-initializer" Apr 17 08:34:39.366566 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.366527 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f28b4e4-26d1-497c-8a54-6788a80a1b6b" containerName="storage-initializer" Apr 17 08:34:39.366566 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.366565 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f28b4e4-26d1-497c-8a54-6788a80a1b6b" containerName="kserve-container" Apr 17 08:34:39.369443 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.369423 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" Apr 17 08:34:39.376347 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.376325 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6"] Apr 17 08:34:39.397778 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.397756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6b6eb45-725d-42ab-9865-6651f2462862-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6\" (UID: \"c6b6eb45-725d-42ab-9865-6651f2462862\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" Apr 17 08:34:39.498760 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.498727 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6b6eb45-725d-42ab-9865-6651f2462862-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6\" (UID: \"c6b6eb45-725d-42ab-9865-6651f2462862\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" Apr 17 08:34:39.499099 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.499079 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6b6eb45-725d-42ab-9865-6651f2462862-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6\" (UID: \"c6b6eb45-725d-42ab-9865-6651f2462862\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" Apr 17 08:34:39.679776 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.679709 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" Apr 17 08:34:39.793562 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.793541 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6"] Apr 17 08:34:39.795809 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:34:39.795780 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6b6eb45_725d_42ab_9865_6651f2462862.slice/crio-5f7c2fbb356ebcbaaf041f76a7865a97c72fdb654eb635cd06ccb15afc479676 WatchSource:0}: Error finding container 5f7c2fbb356ebcbaaf041f76a7865a97c72fdb654eb635cd06ccb15afc479676: Status 404 returned error can't find the container with id 5f7c2fbb356ebcbaaf041f76a7865a97c72fdb654eb635cd06ccb15afc479676 Apr 17 08:34:39.797814 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:39.797796 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:34:40.372748 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:40.372702 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" event={"ID":"c6b6eb45-725d-42ab-9865-6651f2462862","Type":"ContainerStarted","Data":"c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d"} Apr 17 08:34:40.372748 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:40.372743 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" event={"ID":"c6b6eb45-725d-42ab-9865-6651f2462862","Type":"ContainerStarted","Data":"5f7c2fbb356ebcbaaf041f76a7865a97c72fdb654eb635cd06ccb15afc479676"} Apr 17 08:34:43.123121 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.123099 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" Apr 17 08:34:43.222695 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.222670 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/289f51f8-d645-4ecb-8009-ec11650efe99-kserve-provision-location\") pod \"289f51f8-d645-4ecb-8009-ec11650efe99\" (UID: \"289f51f8-d645-4ecb-8009-ec11650efe99\") " Apr 17 08:34:43.222954 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.222934 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/289f51f8-d645-4ecb-8009-ec11650efe99-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "289f51f8-d645-4ecb-8009-ec11650efe99" (UID: "289f51f8-d645-4ecb-8009-ec11650efe99"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:34:43.323628 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.323597 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/289f51f8-d645-4ecb-8009-ec11650efe99-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:34:43.383844 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.383816 2573 generic.go:358] "Generic (PLEG): container finished" podID="289f51f8-d645-4ecb-8009-ec11650efe99" containerID="bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0" exitCode=0 Apr 17 08:34:43.383962 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.383876 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" event={"ID":"289f51f8-d645-4ecb-8009-ec11650efe99","Type":"ContainerDied","Data":"bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0"} Apr 17 08:34:43.383962 ip-10-0-138-143 
kubenswrapper[2573]: I0417 08:34:43.383897 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" event={"ID":"289f51f8-d645-4ecb-8009-ec11650efe99","Type":"ContainerDied","Data":"a6f5cd6c54016692a3ae4dc2e191768d4de42d04437a4ac0c1a6789d217fc7c3"} Apr 17 08:34:43.383962 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.383902 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26" Apr 17 08:34:43.383962 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.383909 2573 scope.go:117] "RemoveContainer" containerID="bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0" Apr 17 08:34:43.391028 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.390810 2573 scope.go:117] "RemoveContainer" containerID="2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5" Apr 17 08:34:43.397289 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.397275 2573 scope.go:117] "RemoveContainer" containerID="bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0" Apr 17 08:34:43.397538 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:34:43.397521 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0\": container with ID starting with bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0 not found: ID does not exist" containerID="bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0" Apr 17 08:34:43.397587 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.397546 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0"} err="failed to get container status \"bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0\": rpc error: code = 
NotFound desc = could not find container \"bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0\": container with ID starting with bee5a0f4afff6dd1879b60112162e50d53184d6854a8ebc1e8e98ed4fc874de0 not found: ID does not exist" Apr 17 08:34:43.397587 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.397561 2573 scope.go:117] "RemoveContainer" containerID="2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5" Apr 17 08:34:43.397788 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:34:43.397773 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5\": container with ID starting with 2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5 not found: ID does not exist" containerID="2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5" Apr 17 08:34:43.397835 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.397790 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5"} err="failed to get container status \"2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5\": rpc error: code = NotFound desc = could not find container \"2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5\": container with ID starting with 2bc2d94fbcd31549b4d4eddd173e82de56e40e45220aa4df34bac69a5e14c9e5 not found: ID does not exist" Apr 17 08:34:43.402538 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.402517 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26"] Apr 17 08:34:43.404660 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:43.404639 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-phs26"] Apr 17 08:34:44.388858 ip-10-0-138-143 
kubenswrapper[2573]: I0417 08:34:44.388829 2573 generic.go:358] "Generic (PLEG): container finished" podID="c6b6eb45-725d-42ab-9865-6651f2462862" containerID="c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d" exitCode=0 Apr 17 08:34:44.389203 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:44.388865 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" event={"ID":"c6b6eb45-725d-42ab-9865-6651f2462862","Type":"ContainerDied","Data":"c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d"} Apr 17 08:34:45.163744 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:45.163714 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" path="/var/lib/kubelet/pods/289f51f8-d645-4ecb-8009-ec11650efe99/volumes" Apr 17 08:34:45.392749 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:45.392716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" event={"ID":"c6b6eb45-725d-42ab-9865-6651f2462862","Type":"ContainerStarted","Data":"83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c"} Apr 17 08:34:45.393122 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:45.392989 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" Apr 17 08:34:45.394090 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:45.394070 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 08:34:45.407617 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:45.407580 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" podStartSLOduration=6.407568429 podStartE2EDuration="6.407568429s" podCreationTimestamp="2026-04-17 08:34:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:34:45.406637729 +0000 UTC m=+2612.749889191" watchObservedRunningTime="2026-04-17 08:34:45.407568429 +0000 UTC m=+2612.750819890" Apr 17 08:34:46.396119 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:46.396082 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 08:34:56.397077 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:34:56.397030 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 08:35:06.396304 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:06.396256 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 08:35:16.396236 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:16.396193 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 08:35:26.396550 
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:26.396503 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 08:35:36.396460 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:36.396415 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 08:35:46.397743 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:46.397696 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" Apr 17 08:35:49.490423 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.490372 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6"] Apr 17 08:35:49.490802 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.490607 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="kserve-container" containerID="cri-o://83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c" gracePeriod=30 Apr 17 08:35:49.560689 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.560662 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr"] Apr 17 08:35:49.560964 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.560948 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" 
containerName="kserve-container" Apr 17 08:35:49.561043 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.560967 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" containerName="kserve-container" Apr 17 08:35:49.561043 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.560991 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" containerName="storage-initializer" Apr 17 08:35:49.561043 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.561000 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" containerName="storage-initializer" Apr 17 08:35:49.561210 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.561070 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="289f51f8-d645-4ecb-8009-ec11650efe99" containerName="kserve-container" Apr 17 08:35:49.564035 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.564014 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" Apr 17 08:35:49.571565 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.571541 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr"] Apr 17 08:35:49.665431 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.665399 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aed19bcc-01d3-49cf-97a6-7984eaef58d8-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-l28fr\" (UID: \"aed19bcc-01d3-49cf-97a6-7984eaef58d8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" Apr 17 08:35:49.766541 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.766421 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aed19bcc-01d3-49cf-97a6-7984eaef58d8-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-l28fr\" (UID: \"aed19bcc-01d3-49cf-97a6-7984eaef58d8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" Apr 17 08:35:49.770410 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.766906 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aed19bcc-01d3-49cf-97a6-7984eaef58d8-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-l28fr\" (UID: \"aed19bcc-01d3-49cf-97a6-7984eaef58d8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" Apr 17 08:35:49.874280 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.874253 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" Apr 17 08:35:49.994659 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:49.994545 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr"] Apr 17 08:35:49.997660 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:35:49.997627 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaed19bcc_01d3_49cf_97a6_7984eaef58d8.slice/crio-a5469575487b892f45cda39d45d9bd5d7e2e687a68982b770699751966205166 WatchSource:0}: Error finding container a5469575487b892f45cda39d45d9bd5d7e2e687a68982b770699751966205166: Status 404 returned error can't find the container with id a5469575487b892f45cda39d45d9bd5d7e2e687a68982b770699751966205166 Apr 17 08:35:50.567655 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:50.567625 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" event={"ID":"aed19bcc-01d3-49cf-97a6-7984eaef58d8","Type":"ContainerStarted","Data":"c3e26a12d4a3ff8f1e6620cf7f1158166cfa9035ff3e688fff114a2208756108"} Apr 17 08:35:50.567655 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:50.567657 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" event={"ID":"aed19bcc-01d3-49cf-97a6-7984eaef58d8","Type":"ContainerStarted","Data":"a5469575487b892f45cda39d45d9bd5d7e2e687a68982b770699751966205166"} Apr 17 08:35:53.424300 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.424279 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" Apr 17 08:35:53.492126 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.492052 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6b6eb45-725d-42ab-9865-6651f2462862-kserve-provision-location\") pod \"c6b6eb45-725d-42ab-9865-6651f2462862\" (UID: \"c6b6eb45-725d-42ab-9865-6651f2462862\") " Apr 17 08:35:53.492350 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.492328 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6b6eb45-725d-42ab-9865-6651f2462862-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c6b6eb45-725d-42ab-9865-6651f2462862" (UID: "c6b6eb45-725d-42ab-9865-6651f2462862"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:35:53.575818 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.575785 2573 generic.go:358] "Generic (PLEG): container finished" podID="c6b6eb45-725d-42ab-9865-6651f2462862" containerID="83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c" exitCode=0 Apr 17 08:35:53.575925 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.575822 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" event={"ID":"c6b6eb45-725d-42ab-9865-6651f2462862","Type":"ContainerDied","Data":"83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c"} Apr 17 08:35:53.575925 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.575859 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" event={"ID":"c6b6eb45-725d-42ab-9865-6651f2462862","Type":"ContainerDied","Data":"5f7c2fbb356ebcbaaf041f76a7865a97c72fdb654eb635cd06ccb15afc479676"} Apr 17 
08:35:53.575925 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.575861 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6" Apr 17 08:35:53.576076 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.575923 2573 scope.go:117] "RemoveContainer" containerID="83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c" Apr 17 08:35:53.583547 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.583530 2573 scope.go:117] "RemoveContainer" containerID="c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d" Apr 17 08:35:53.590107 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.590092 2573 scope.go:117] "RemoveContainer" containerID="83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c" Apr 17 08:35:53.590354 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:35:53.590336 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c\": container with ID starting with 83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c not found: ID does not exist" containerID="83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c" Apr 17 08:35:53.590420 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.590362 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c"} err="failed to get container status \"83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c\": rpc error: code = NotFound desc = could not find container \"83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c\": container with ID starting with 83bb75fbc8aaea810c72417d528c3c9c3b1b175f050d95e334e92332f6cac45c not found: ID does not exist" Apr 17 08:35:53.590420 ip-10-0-138-143 kubenswrapper[2573]: I0417 
08:35:53.590400 2573 scope.go:117] "RemoveContainer" containerID="c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d" Apr 17 08:35:53.590644 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:35:53.590628 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d\": container with ID starting with c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d not found: ID does not exist" containerID="c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d" Apr 17 08:35:53.590681 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.590650 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d"} err="failed to get container status \"c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d\": rpc error: code = NotFound desc = could not find container \"c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d\": container with ID starting with c96e8196cfed03fd151ab12ead9ac1f409f909ef8a9e79d47a8d4d876545c85d not found: ID does not exist" Apr 17 08:35:53.592944 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.592929 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6b6eb45-725d-42ab-9865-6651f2462862-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:35:53.595572 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.595542 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6"] Apr 17 08:35:53.598421 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:53.598401 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-k7rd6"] Apr 17 
08:35:54.580212 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:54.580185 2573 generic.go:358] "Generic (PLEG): container finished" podID="aed19bcc-01d3-49cf-97a6-7984eaef58d8" containerID="c3e26a12d4a3ff8f1e6620cf7f1158166cfa9035ff3e688fff114a2208756108" exitCode=0
Apr 17 08:35:54.580616 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:54.580259 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" event={"ID":"aed19bcc-01d3-49cf-97a6-7984eaef58d8","Type":"ContainerDied","Data":"c3e26a12d4a3ff8f1e6620cf7f1158166cfa9035ff3e688fff114a2208756108"}
Apr 17 08:35:55.165048 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:55.165010 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" path="/var/lib/kubelet/pods/c6b6eb45-725d-42ab-9865-6651f2462862/volumes"
Apr 17 08:35:59.598960 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:59.598926 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" event={"ID":"aed19bcc-01d3-49cf-97a6-7984eaef58d8","Type":"ContainerStarted","Data":"45c10746c6d4ed0b5fdcbd67add34b05e7e201fcac077e67343b89409c12e31d"}
Apr 17 08:35:59.599354 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:59.599220 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr"
Apr 17 08:35:59.600527 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:59.600490 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" podUID="aed19bcc-01d3-49cf-97a6-7984eaef58d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 17 08:35:59.615394 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:35:59.615340 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" podStartSLOduration=6.287505189 podStartE2EDuration="10.61532831s" podCreationTimestamp="2026-04-17 08:35:49 +0000 UTC" firstStartedPulling="2026-04-17 08:35:54.581466076 +0000 UTC m=+2681.924717517" lastFinishedPulling="2026-04-17 08:35:58.909289193 +0000 UTC m=+2686.252540638" observedRunningTime="2026-04-17 08:35:59.613654721 +0000 UTC m=+2686.956906181" watchObservedRunningTime="2026-04-17 08:35:59.61532831 +0000 UTC m=+2686.958579774"
Apr 17 08:36:00.602538 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:00.602492 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" podUID="aed19bcc-01d3-49cf-97a6-7984eaef58d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 17 08:36:10.603993 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:10.603899 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr"
Apr 17 08:36:29.330811 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.330781 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr"]
Apr 17 08:36:29.331288 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.331047 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" podUID="aed19bcc-01d3-49cf-97a6-7984eaef58d8" containerName="kserve-container" containerID="cri-o://45c10746c6d4ed0b5fdcbd67add34b05e7e201fcac077e67343b89409c12e31d" gracePeriod=30
Apr 17 08:36:29.384769 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.384741 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"]
Apr 17 08:36:29.385055 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.385038 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="storage-initializer"
Apr 17 08:36:29.385140 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.385058 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="storage-initializer"
Apr 17 08:36:29.385140 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.385068 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="kserve-container"
Apr 17 08:36:29.385140 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.385077 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="kserve-container"
Apr 17 08:36:29.385299 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.385154 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6b6eb45-725d-42ab-9865-6651f2462862" containerName="kserve-container"
Apr 17 08:36:29.388212 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.388192 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"
Apr 17 08:36:29.394530 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.394508 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"]
Apr 17 08:36:29.526425 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.526373 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cbed1b8-beef-479f-8834-f1a8f8ea4299-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r\" (UID: \"8cbed1b8-beef-479f-8834-f1a8f8ea4299\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"
Apr 17 08:36:29.627405 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.627330 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cbed1b8-beef-479f-8834-f1a8f8ea4299-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r\" (UID: \"8cbed1b8-beef-479f-8834-f1a8f8ea4299\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"
Apr 17 08:36:29.627696 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.627677 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cbed1b8-beef-479f-8834-f1a8f8ea4299-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r\" (UID: \"8cbed1b8-beef-479f-8834-f1a8f8ea4299\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"
Apr 17 08:36:29.698287 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.698259 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"
Apr 17 08:36:29.822191 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:29.822144 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"]
Apr 17 08:36:29.826171 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:36:29.826139 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cbed1b8_beef_479f_8834_f1a8f8ea4299.slice/crio-7aaae3dbf0005b2357f3bab440ff4f30705d834199ebaf82b25fda54c6ffcc2c WatchSource:0}: Error finding container 7aaae3dbf0005b2357f3bab440ff4f30705d834199ebaf82b25fda54c6ffcc2c: Status 404 returned error can't find the container with id 7aaae3dbf0005b2357f3bab440ff4f30705d834199ebaf82b25fda54c6ffcc2c
Apr 17 08:36:30.682903 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:30.682867 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r" event={"ID":"8cbed1b8-beef-479f-8834-f1a8f8ea4299","Type":"ContainerStarted","Data":"281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5"}
Apr 17 08:36:30.682903 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:30.682901 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r" event={"ID":"8cbed1b8-beef-479f-8834-f1a8f8ea4299","Type":"ContainerStarted","Data":"7aaae3dbf0005b2357f3bab440ff4f30705d834199ebaf82b25fda54c6ffcc2c"}
Apr 17 08:36:34.695609 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:34.695568 2573 generic.go:358] "Generic (PLEG): container finished" podID="8cbed1b8-beef-479f-8834-f1a8f8ea4299" containerID="281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5" exitCode=0
Apr 17 08:36:34.696062 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:34.695649 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r" event={"ID":"8cbed1b8-beef-479f-8834-f1a8f8ea4299","Type":"ContainerDied","Data":"281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5"}
Apr 17 08:36:35.700701 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:35.700669 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r" event={"ID":"8cbed1b8-beef-479f-8834-f1a8f8ea4299","Type":"ContainerStarted","Data":"211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf"}
Apr 17 08:36:35.701093 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:35.701031 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"
Apr 17 08:36:35.702093 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:35.702065 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r" podUID="8cbed1b8-beef-479f-8834-f1a8f8ea4299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 17 08:36:35.716326 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:35.716280 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r" podStartSLOduration=6.716269298 podStartE2EDuration="6.716269298s" podCreationTimestamp="2026-04-17 08:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:36:35.715335453 +0000 UTC m=+2723.058586915" watchObservedRunningTime="2026-04-17 08:36:35.716269298 +0000 UTC m=+2723.059520761"
Apr 17 08:36:36.703681 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:36.703643 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r" podUID="8cbed1b8-beef-479f-8834-f1a8f8ea4299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 17 08:36:46.705370 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:46.705329 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"
Apr 17 08:36:59.141424 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.141373 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"]
Apr 17 08:36:59.141805 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.141729 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r" podUID="8cbed1b8-beef-479f-8834-f1a8f8ea4299" containerName="kserve-container" containerID="cri-o://211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf" gracePeriod=30
Apr 17 08:36:59.180850 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.180821 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"]
Apr 17 08:36:59.183865 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.183849 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"
Apr 17 08:36:59.192164 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.192141 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"]
Apr 17 08:36:59.221990 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.221968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4afdca10-2538-4d5a-b169-22ec98e22050-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-7rsds\" (UID: \"4afdca10-2538-4d5a-b169-22ec98e22050\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"
Apr 17 08:36:59.323045 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.323015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4afdca10-2538-4d5a-b169-22ec98e22050-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-7rsds\" (UID: \"4afdca10-2538-4d5a-b169-22ec98e22050\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"
Apr 17 08:36:59.323445 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.323413 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4afdca10-2538-4d5a-b169-22ec98e22050-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-7rsds\" (UID: \"4afdca10-2538-4d5a-b169-22ec98e22050\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"
Apr 17 08:36:59.493931 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.493905 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"
Apr 17 08:36:59.606567 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.606538 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"]
Apr 17 08:36:59.609114 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:36:59.609082 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4afdca10_2538_4d5a_b169_22ec98e22050.slice/crio-339ab5035d2d5f4d751d2a93221156aa26e667aadb76f2ff79f7a38a0105b4c8 WatchSource:0}: Error finding container 339ab5035d2d5f4d751d2a93221156aa26e667aadb76f2ff79f7a38a0105b4c8: Status 404 returned error can't find the container with id 339ab5035d2d5f4d751d2a93221156aa26e667aadb76f2ff79f7a38a0105b4c8
Apr 17 08:36:59.765798 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.765770 2573 generic.go:358] "Generic (PLEG): container finished" podID="aed19bcc-01d3-49cf-97a6-7984eaef58d8" containerID="45c10746c6d4ed0b5fdcbd67add34b05e7e201fcac077e67343b89409c12e31d" exitCode=137
Apr 17 08:36:59.765949 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.765834 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" event={"ID":"aed19bcc-01d3-49cf-97a6-7984eaef58d8","Type":"ContainerDied","Data":"45c10746c6d4ed0b5fdcbd67add34b05e7e201fcac077e67343b89409c12e31d"}
Apr 17 08:36:59.767059 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.767033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds" event={"ID":"4afdca10-2538-4d5a-b169-22ec98e22050","Type":"ContainerStarted","Data":"0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366"}
Apr 17 08:36:59.767174 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.767063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds" event={"ID":"4afdca10-2538-4d5a-b169-22ec98e22050","Type":"ContainerStarted","Data":"339ab5035d2d5f4d751d2a93221156aa26e667aadb76f2ff79f7a38a0105b4c8"}
Apr 17 08:36:59.961163 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:36:59.961143 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr"
Apr 17 08:37:00.030278 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:00.030254 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aed19bcc-01d3-49cf-97a6-7984eaef58d8-kserve-provision-location\") pod \"aed19bcc-01d3-49cf-97a6-7984eaef58d8\" (UID: \"aed19bcc-01d3-49cf-97a6-7984eaef58d8\") "
Apr 17 08:37:00.040467 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:00.040434 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aed19bcc-01d3-49cf-97a6-7984eaef58d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aed19bcc-01d3-49cf-97a6-7984eaef58d8" (UID: "aed19bcc-01d3-49cf-97a6-7984eaef58d8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:37:00.131030 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:00.131001 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aed19bcc-01d3-49cf-97a6-7984eaef58d8-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:37:00.771486 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:00.771458 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr"
Apr 17 08:37:00.771486 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:00.771459 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr" event={"ID":"aed19bcc-01d3-49cf-97a6-7984eaef58d8","Type":"ContainerDied","Data":"a5469575487b892f45cda39d45d9bd5d7e2e687a68982b770699751966205166"}
Apr 17 08:37:00.771949 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:00.771519 2573 scope.go:117] "RemoveContainer" containerID="45c10746c6d4ed0b5fdcbd67add34b05e7e201fcac077e67343b89409c12e31d"
Apr 17 08:37:00.785952 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:00.785923 2573 scope.go:117] "RemoveContainer" containerID="c3e26a12d4a3ff8f1e6620cf7f1158166cfa9035ff3e688fff114a2208756108"
Apr 17 08:37:00.795724 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:00.795696 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr"]
Apr 17 08:37:00.798719 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:00.798695 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-l28fr"]
Apr 17 08:37:01.164772 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:01.164699 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed19bcc-01d3-49cf-97a6-7984eaef58d8" path="/var/lib/kubelet/pods/aed19bcc-01d3-49cf-97a6-7984eaef58d8/volumes"
Apr 17 08:37:03.781352 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:03.781273 2573 generic.go:358] "Generic (PLEG): container finished" podID="4afdca10-2538-4d5a-b169-22ec98e22050" containerID="0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366" exitCode=0
Apr 17 08:37:03.781732 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:03.781343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds" event={"ID":"4afdca10-2538-4d5a-b169-22ec98e22050","Type":"ContainerDied","Data":"0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366"}
Apr 17 08:37:29.825909 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.825871 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"
Apr 17 08:37:29.885122 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.885079 2573 generic.go:358] "Generic (PLEG): container finished" podID="8cbed1b8-beef-479f-8834-f1a8f8ea4299" containerID="211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf" exitCode=137
Apr 17 08:37:29.885293 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.885131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r" event={"ID":"8cbed1b8-beef-479f-8834-f1a8f8ea4299","Type":"ContainerDied","Data":"211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf"}
Apr 17 08:37:29.885293 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.885149 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"
Apr 17 08:37:29.885293 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.885169 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r" event={"ID":"8cbed1b8-beef-479f-8834-f1a8f8ea4299","Type":"ContainerDied","Data":"7aaae3dbf0005b2357f3bab440ff4f30705d834199ebaf82b25fda54c6ffcc2c"}
Apr 17 08:37:29.885293 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.885187 2573 scope.go:117] "RemoveContainer" containerID="211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf"
Apr 17 08:37:29.893955 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.893919 2573 scope.go:117] "RemoveContainer" containerID="281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5"
Apr 17 08:37:29.902655 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.902633 2573 scope.go:117] "RemoveContainer" containerID="211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf"
Apr 17 08:37:29.902910 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:37:29.902885 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf\": container with ID starting with 211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf not found: ID does not exist" containerID="211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf"
Apr 17 08:37:29.903006 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.902920 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf"} err="failed to get container status \"211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf\": rpc error: code = NotFound desc = could not find container \"211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf\": container with ID starting with 211b4287398cc1714cfd1ce4877ce212f9446efc2b7b8c4287ba4329aba1bddf not found: ID does not exist"
Apr 17 08:37:29.903006 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.902955 2573 scope.go:117] "RemoveContainer" containerID="281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5"
Apr 17 08:37:29.903209 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:37:29.903190 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5\": container with ID starting with 281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5 not found: ID does not exist" containerID="281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5"
Apr 17 08:37:29.903270 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.903216 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5"} err="failed to get container status \"281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5\": rpc error: code = NotFound desc = could not find container \"281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5\": container with ID starting with 281dd0fef9cd1464daa0fde2d0cebb3aa4b668440b6edb398a28f431f6bb59e5 not found: ID does not exist"
Apr 17 08:37:29.969973 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.969951 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cbed1b8-beef-479f-8834-f1a8f8ea4299-kserve-provision-location\") pod \"8cbed1b8-beef-479f-8834-f1a8f8ea4299\" (UID: \"8cbed1b8-beef-479f-8834-f1a8f8ea4299\") "
Apr 17 08:37:29.974535 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:29.974502 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cbed1b8-beef-479f-8834-f1a8f8ea4299-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8cbed1b8-beef-479f-8834-f1a8f8ea4299" (UID: "8cbed1b8-beef-479f-8834-f1a8f8ea4299"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:37:30.071275 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:30.071245 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cbed1b8-beef-479f-8834-f1a8f8ea4299-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:37:30.210455 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:30.210428 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"]
Apr 17 08:37:30.213948 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:30.213903 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-rkx7r"]
Apr 17 08:37:31.164843 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:37:31.164800 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cbed1b8-beef-479f-8834-f1a8f8ea4299" path="/var/lib/kubelet/pods/8cbed1b8-beef-479f-8834-f1a8f8ea4299/volumes"
Apr 17 08:38:59.141005 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:38:59.140966 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds" event={"ID":"4afdca10-2538-4d5a-b169-22ec98e22050","Type":"ContainerStarted","Data":"158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a"}
Apr 17 08:38:59.141428 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:38:59.141286 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"
Apr 17 08:38:59.142435 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:38:59.142408 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds" podUID="4afdca10-2538-4d5a-b169-22ec98e22050" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 17 08:38:59.156424 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:38:59.156358 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds" podStartSLOduration=5.592528489 podStartE2EDuration="2m0.156341637s" podCreationTimestamp="2026-04-17 08:36:59 +0000 UTC" firstStartedPulling="2026-04-17 08:37:03.782321922 +0000 UTC m=+2751.125573364" lastFinishedPulling="2026-04-17 08:38:58.346135071 +0000 UTC m=+2865.689386512" observedRunningTime="2026-04-17 08:38:59.15560641 +0000 UTC m=+2866.498857875" watchObservedRunningTime="2026-04-17 08:38:59.156341637 +0000 UTC m=+2866.499593099"
Apr 17 08:39:00.143821 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:00.143783 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds" podUID="4afdca10-2538-4d5a-b169-22ec98e22050" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 17 08:39:10.144514 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:10.144436 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"
Apr 17 08:39:21.559102 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.559068 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"]
Apr 17 08:39:21.559630 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.559440 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds" podUID="4afdca10-2538-4d5a-b169-22ec98e22050" containerName="kserve-container" containerID="cri-o://158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a" gracePeriod=30
Apr 17 08:39:21.643264 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.643228 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"]
Apr 17 08:39:21.643579 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.643563 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cbed1b8-beef-479f-8834-f1a8f8ea4299" containerName="storage-initializer"
Apr 17 08:39:21.643659 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.643582 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbed1b8-beef-479f-8834-f1a8f8ea4299" containerName="storage-initializer"
Apr 17 08:39:21.643659 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.643620 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aed19bcc-01d3-49cf-97a6-7984eaef58d8" containerName="storage-initializer"
Apr 17 08:39:21.643659 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.643630 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed19bcc-01d3-49cf-97a6-7984eaef58d8" containerName="storage-initializer"
Apr 17 08:39:21.643659 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.643643 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aed19bcc-01d3-49cf-97a6-7984eaef58d8" containerName="kserve-container"
Apr 17 08:39:21.643659 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.643652 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed19bcc-01d3-49cf-97a6-7984eaef58d8" containerName="kserve-container"
Apr 17 08:39:21.643914 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.643665 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cbed1b8-beef-479f-8834-f1a8f8ea4299" containerName="kserve-container"
Apr 17 08:39:21.643914 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.643673 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbed1b8-beef-479f-8834-f1a8f8ea4299" containerName="kserve-container"
Apr 17 08:39:21.643914 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.643747 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="aed19bcc-01d3-49cf-97a6-7984eaef58d8" containerName="kserve-container"
Apr 17 08:39:21.643914 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.643761 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cbed1b8-beef-479f-8834-f1a8f8ea4299" containerName="kserve-container"
Apr 17 08:39:21.646705 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.646683 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"
Apr 17 08:39:21.655481 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.655457 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"]
Apr 17 08:39:21.810294 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.810199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe393d16-04a2-43bb-b11b-2b7873f9ed96-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-crjw7\" (UID: \"fe393d16-04a2-43bb-b11b-2b7873f9ed96\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"
Apr 17 08:39:21.911047 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.910995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe393d16-04a2-43bb-b11b-2b7873f9ed96-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-crjw7\" (UID: \"fe393d16-04a2-43bb-b11b-2b7873f9ed96\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"
Apr 17 08:39:21.911442 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.911424 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe393d16-04a2-43bb-b11b-2b7873f9ed96-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-crjw7\" (UID: \"fe393d16-04a2-43bb-b11b-2b7873f9ed96\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"
Apr 17 08:39:21.957154 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:21.957120 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"
Apr 17 08:39:22.080547 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:22.080520 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"]
Apr 17 08:39:22.083004 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:39:22.082961 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe393d16_04a2_43bb_b11b_2b7873f9ed96.slice/crio-1a4e656d6c470c3c23f9e218b4ce43bd610be1eb31a38eeaa9ae1166cefd50af WatchSource:0}: Error finding container 1a4e656d6c470c3c23f9e218b4ce43bd610be1eb31a38eeaa9ae1166cefd50af: Status 404 returned error can't find the container with id 1a4e656d6c470c3c23f9e218b4ce43bd610be1eb31a38eeaa9ae1166cefd50af
Apr 17 08:39:22.197694 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:22.197652 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" event={"ID":"fe393d16-04a2-43bb-b11b-2b7873f9ed96","Type":"ContainerStarted","Data":"e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e"}
Apr 17 08:39:22.197860 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:22.197698 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" event={"ID":"fe393d16-04a2-43bb-b11b-2b7873f9ed96","Type":"ContainerStarted","Data":"1a4e656d6c470c3c23f9e218b4ce43bd610be1eb31a38eeaa9ae1166cefd50af"}
Apr 17 08:39:24.111994 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.111966 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"
Apr 17 08:39:24.204696 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.204664 2573 generic.go:358] "Generic (PLEG): container finished" podID="4afdca10-2538-4d5a-b169-22ec98e22050" containerID="158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a" exitCode=0
Apr 17 08:39:24.204882 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.204756 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"
Apr 17 08:39:24.204882 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.204756 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds" event={"ID":"4afdca10-2538-4d5a-b169-22ec98e22050","Type":"ContainerDied","Data":"158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a"}
Apr 17 08:39:24.204882 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.204799 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds" event={"ID":"4afdca10-2538-4d5a-b169-22ec98e22050","Type":"ContainerDied","Data":"339ab5035d2d5f4d751d2a93221156aa26e667aadb76f2ff79f7a38a0105b4c8"}
Apr 17 08:39:24.204882 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.204816 2573 scope.go:117] "RemoveContainer" containerID="158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a"
Apr 17 08:39:24.212441 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.212419 2573 scope.go:117] "RemoveContainer" containerID="0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366"
Apr 17 08:39:24.219253 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.219229 2573 scope.go:117] "RemoveContainer" containerID="158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a"
Apr 17 08:39:24.219541 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:39:24.219521 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a\": container with ID starting with 158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a not found: ID does not exist" containerID="158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a"
Apr 17 08:39:24.219623 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.219549 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a"} err="failed to get container status \"158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a\": rpc error: code = NotFound desc = could not find container \"158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a\": container with ID starting with 158bf930bea657f8772e034907019c8ba0050fa70795b1ff1135b29e6d017e8a not found: ID does not exist"
Apr 17 08:39:24.219623 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.219567 2573 scope.go:117] "RemoveContainer" containerID="0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366"
Apr 17 08:39:24.219818 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:39:24.219802 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366\": container with ID starting with 0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366 not found: ID does not exist" containerID="0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366"
Apr 17 08:39:24.219943 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.219822 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366"} err="failed to get container status \"0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366\": rpc error: code = NotFound desc = could not find container \"0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366\": container with ID starting with 0d428c06051832242a138be98c23cea26f365e1c81ab4b5142175fd9c949d366 not found: ID does not exist"
Apr 17 08:39:24.226071 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.226052 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4afdca10-2538-4d5a-b169-22ec98e22050-kserve-provision-location\") pod \"4afdca10-2538-4d5a-b169-22ec98e22050\" (UID: \"4afdca10-2538-4d5a-b169-22ec98e22050\") "
Apr 17 08:39:24.226436 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.226412 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4afdca10-2538-4d5a-b169-22ec98e22050-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4afdca10-2538-4d5a-b169-22ec98e22050" (UID: "4afdca10-2538-4d5a-b169-22ec98e22050"). InnerVolumeSpecName "kserve-provision-location".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:39:24.327566 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.327527 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4afdca10-2538-4d5a-b169-22ec98e22050-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:39:24.525241 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.525212 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"] Apr 17 08:39:24.529145 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:24.529122 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-7rsds"] Apr 17 08:39:25.163227 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:25.163192 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4afdca10-2538-4d5a-b169-22ec98e22050" path="/var/lib/kubelet/pods/4afdca10-2538-4d5a-b169-22ec98e22050/volumes" Apr 17 08:39:26.212540 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:26.212500 2573 generic.go:358] "Generic (PLEG): container finished" podID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerID="e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e" exitCode=0 Apr 17 08:39:26.212540 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:26.212538 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" event={"ID":"fe393d16-04a2-43bb-b11b-2b7873f9ed96","Type":"ContainerDied","Data":"e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e"} Apr 17 08:39:46.272909 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:46.272870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" 
event={"ID":"fe393d16-04a2-43bb-b11b-2b7873f9ed96","Type":"ContainerStarted","Data":"1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5"}
Apr 17 08:39:46.273372 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:46.273201 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"
Apr 17 08:39:46.274362 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:46.274327 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 08:39:46.287678 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:46.287632 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" podStartSLOduration=6.041931342 podStartE2EDuration="25.287599356s" podCreationTimestamp="2026-04-17 08:39:21 +0000 UTC" firstStartedPulling="2026-04-17 08:39:26.21360572 +0000 UTC m=+2893.556857161" lastFinishedPulling="2026-04-17 08:39:45.459273729 +0000 UTC m=+2912.802525175" observedRunningTime="2026-04-17 08:39:46.287285031 +0000 UTC m=+2913.630536490" watchObservedRunningTime="2026-04-17 08:39:46.287599356 +0000 UTC m=+2913.630850816"
Apr 17 08:39:47.275221 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:47.275182 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 08:39:57.275813 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:39:57.275768 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 08:40:07.275728 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:07.275687 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 08:40:17.275530 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:17.275494 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 08:40:27.275477 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:27.275437 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 08:40:37.276280 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:37.276174 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 08:40:47.276240 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:47.276212 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"
Apr 17 08:40:51.777603 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:51.777567 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"]
Apr 17 08:40:51.778052 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:51.777880 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="kserve-container" containerID="cri-o://1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5" gracePeriod=30
Apr 17 08:40:51.855411 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:51.855354 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"]
Apr 17 08:40:51.855742 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:51.855724 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4afdca10-2538-4d5a-b169-22ec98e22050" containerName="kserve-container"
Apr 17 08:40:51.855742 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:51.855743 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afdca10-2538-4d5a-b169-22ec98e22050" containerName="kserve-container"
Apr 17 08:40:51.855881 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:51.855775 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4afdca10-2538-4d5a-b169-22ec98e22050" containerName="storage-initializer"
Apr 17 08:40:51.855881 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:51.855784 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afdca10-2538-4d5a-b169-22ec98e22050" containerName="storage-initializer"
Apr 17 08:40:51.855881 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:51.855862 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4afdca10-2538-4d5a-b169-22ec98e22050" containerName="kserve-container"
Apr 17 08:40:51.858736 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:51.858716 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"
Apr 17 08:40:51.865832 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:51.865803 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"]
Apr 17 08:40:51.917326 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:51.917296 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3df19379-afa2-4117-8a3b-112889674309-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4\" (UID: \"3df19379-afa2-4117-8a3b-112889674309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"
Apr 17 08:40:52.018470 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:52.018441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3df19379-afa2-4117-8a3b-112889674309-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4\" (UID: \"3df19379-afa2-4117-8a3b-112889674309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"
Apr 17 08:40:52.018794 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:52.018776 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3df19379-afa2-4117-8a3b-112889674309-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4\" (UID: \"3df19379-afa2-4117-8a3b-112889674309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"
Apr 17 08:40:52.168766 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:52.168697 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"
Apr 17 08:40:52.287007 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:52.286975 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"]
Apr 17 08:40:52.289964 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:40:52.289939 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3df19379_afa2_4117_8a3b_112889674309.slice/crio-f9e9185c95f34a52faa213e51fe573015b501b260ea5b984e2f0f27af170d002 WatchSource:0}: Error finding container f9e9185c95f34a52faa213e51fe573015b501b260ea5b984e2f0f27af170d002: Status 404 returned error can't find the container with id f9e9185c95f34a52faa213e51fe573015b501b260ea5b984e2f0f27af170d002
Apr 17 08:40:52.292227 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:52.292211 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 08:40:52.435995 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:52.435968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4" event={"ID":"3df19379-afa2-4117-8a3b-112889674309","Type":"ContainerStarted","Data":"60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e"}
Apr 17 08:40:52.436128 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:52.436005 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4" event={"ID":"3df19379-afa2-4117-8a3b-112889674309","Type":"ContainerStarted","Data":"f9e9185c95f34a52faa213e51fe573015b501b260ea5b984e2f0f27af170d002"}
Apr 17 08:40:55.007932 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.007910 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"
Apr 17 08:40:55.138782 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.138709 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe393d16-04a2-43bb-b11b-2b7873f9ed96-kserve-provision-location\") pod \"fe393d16-04a2-43bb-b11b-2b7873f9ed96\" (UID: \"fe393d16-04a2-43bb-b11b-2b7873f9ed96\") "
Apr 17 08:40:55.139005 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.138986 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe393d16-04a2-43bb-b11b-2b7873f9ed96-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fe393d16-04a2-43bb-b11b-2b7873f9ed96" (UID: "fe393d16-04a2-43bb-b11b-2b7873f9ed96"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:40:55.240140 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.240109 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe393d16-04a2-43bb-b11b-2b7873f9ed96-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:40:55.444788 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.444757 2573 generic.go:358] "Generic (PLEG): container finished" podID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerID="1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5" exitCode=0
Apr 17 08:40:55.444953 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.444816 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"
Apr 17 08:40:55.444953 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.444813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" event={"ID":"fe393d16-04a2-43bb-b11b-2b7873f9ed96","Type":"ContainerDied","Data":"1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5"}
Apr 17 08:40:55.444953 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.444931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7" event={"ID":"fe393d16-04a2-43bb-b11b-2b7873f9ed96","Type":"ContainerDied","Data":"1a4e656d6c470c3c23f9e218b4ce43bd610be1eb31a38eeaa9ae1166cefd50af"}
Apr 17 08:40:55.444953 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.444952 2573 scope.go:117] "RemoveContainer" containerID="1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5"
Apr 17 08:40:55.452445 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.452431 2573 scope.go:117] "RemoveContainer" containerID="e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e"
Apr 17 08:40:55.459222 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.459201 2573 scope.go:117] "RemoveContainer" containerID="1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5"
Apr 17 08:40:55.459498 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:40:55.459480 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5\": container with ID starting with 1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5 not found: ID does not exist" containerID="1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5"
Apr 17 08:40:55.459572 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.459506 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5"} err="failed to get container status \"1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5\": rpc error: code = NotFound desc = could not find container \"1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5\": container with ID starting with 1dca6030ba8d3f06ac796f2dcbb6e8aab522fa60df4738adbac6a76962f93bb5 not found: ID does not exist"
Apr 17 08:40:55.459572 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.459522 2573 scope.go:117] "RemoveContainer" containerID="e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e"
Apr 17 08:40:55.459572 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.459556 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"]
Apr 17 08:40:55.459766 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:40:55.459742 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e\": container with ID starting with e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e not found: ID does not exist" containerID="e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e"
Apr 17 08:40:55.459830 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.459775 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e"} err="failed to get container status \"e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e\": rpc error: code = NotFound desc = could not find container \"e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e\": container with ID starting with e23d3e50aab0c70947cfd11ac1ec70bdccdd3944f0a1d6373ef5f35588edf85e not found: ID does not exist"
Apr 17 08:40:55.461812 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:55.461794 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-crjw7"]
Apr 17 08:40:56.448762 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:56.448735 2573 generic.go:358] "Generic (PLEG): container finished" podID="3df19379-afa2-4117-8a3b-112889674309" containerID="60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e" exitCode=0
Apr 17 08:40:56.449125 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:56.448812 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4" event={"ID":"3df19379-afa2-4117-8a3b-112889674309","Type":"ContainerDied","Data":"60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e"}
Apr 17 08:40:57.164344 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:57.164302 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" path="/var/lib/kubelet/pods/fe393d16-04a2-43bb-b11b-2b7873f9ed96/volumes"
Apr 17 08:40:57.452519 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:57.452489 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4" event={"ID":"3df19379-afa2-4117-8a3b-112889674309","Type":"ContainerStarted","Data":"b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe"}
Apr 17 08:40:57.452870 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:57.452690 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"
Apr 17 08:40:57.468909 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:40:57.468862 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4" podStartSLOduration=6.468850757 podStartE2EDuration="6.468850757s" podCreationTimestamp="2026-04-17 08:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:40:57.466968015 +0000 UTC m=+2984.810219489" watchObservedRunningTime="2026-04-17 08:40:57.468850757 +0000 UTC m=+2984.812102219"
Apr 17 08:41:28.503014 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:28.502977 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"
Apr 17 08:41:31.924114 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:31.924075 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"]
Apr 17 08:41:31.924594 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:31.924452 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4" podUID="3df19379-afa2-4117-8a3b-112889674309" containerName="kserve-container" containerID="cri-o://b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe" gracePeriod=30
Apr 17 08:41:31.984001 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:31.983967 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"]
Apr 17 08:41:31.984278 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:31.984263 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="kserve-container"
Apr 17 08:41:31.984359 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:31.984282 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="kserve-container"
Apr 17 08:41:31.984359 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:31.984302 2573 cpu_manager.go:401]
"RemoveStaleState: containerMap: removing container" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="storage-initializer"
Apr 17 08:41:31.984359 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:31.984311 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="storage-initializer"
Apr 17 08:41:31.984551 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:31.984370 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe393d16-04a2-43bb-b11b-2b7873f9ed96" containerName="kserve-container"
Apr 17 08:41:31.987426 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:31.987407 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"
Apr 17 08:41:31.994676 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:31.994649 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"]
Apr 17 08:41:32.087571 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:32.087546 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79c3df37-1a9d-4bb7-b041-25241af62213-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-464wl\" (UID: \"79c3df37-1a9d-4bb7-b041-25241af62213\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"
Apr 17 08:41:32.188607 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:32.188582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79c3df37-1a9d-4bb7-b041-25241af62213-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-464wl\" (UID: \"79c3df37-1a9d-4bb7-b041-25241af62213\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"
Apr 17 08:41:32.188895 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:32.188877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79c3df37-1a9d-4bb7-b041-25241af62213-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-464wl\" (UID: \"79c3df37-1a9d-4bb7-b041-25241af62213\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"
Apr 17 08:41:32.297883 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:32.297854 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"
Apr 17 08:41:32.419974 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:32.419907 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"]
Apr 17 08:41:32.422696 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:41:32.422670 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79c3df37_1a9d_4bb7_b041_25241af62213.slice/crio-beb12fb83ac257e99d619151b57e10d540a23f2361a261b5c46de1cede6b8b79 WatchSource:0}: Error finding container beb12fb83ac257e99d619151b57e10d540a23f2361a261b5c46de1cede6b8b79: Status 404 returned error can't find the container with id beb12fb83ac257e99d619151b57e10d540a23f2361a261b5c46de1cede6b8b79
Apr 17 08:41:32.543438 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:32.543405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl" event={"ID":"79c3df37-1a9d-4bb7-b041-25241af62213","Type":"ContainerStarted","Data":"08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586"}
Apr 17 08:41:32.543438 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:32.543439 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl" event={"ID":"79c3df37-1a9d-4bb7-b041-25241af62213","Type":"ContainerStarted","Data":"beb12fb83ac257e99d619151b57e10d540a23f2361a261b5c46de1cede6b8b79"}
Apr 17 08:41:36.555686 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:36.555650 2573 generic.go:358] "Generic (PLEG): container finished" podID="79c3df37-1a9d-4bb7-b041-25241af62213" containerID="08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586" exitCode=0
Apr 17 08:41:36.556156 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:36.555730 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl" event={"ID":"79c3df37-1a9d-4bb7-b041-25241af62213","Type":"ContainerDied","Data":"08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586"}
Apr 17 08:41:37.364612 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.364591 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"
Apr 17 08:41:37.526238 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.526157 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3df19379-afa2-4117-8a3b-112889674309-kserve-provision-location\") pod \"3df19379-afa2-4117-8a3b-112889674309\" (UID: \"3df19379-afa2-4117-8a3b-112889674309\") "
Apr 17 08:41:37.526491 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.526466 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df19379-afa2-4117-8a3b-112889674309-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3df19379-afa2-4117-8a3b-112889674309" (UID: "3df19379-afa2-4117-8a3b-112889674309"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:41:37.559799 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.559767 2573 generic.go:358] "Generic (PLEG): container finished" podID="3df19379-afa2-4117-8a3b-112889674309" containerID="b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe" exitCode=0
Apr 17 08:41:37.560198 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.559836 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"
Apr 17 08:41:37.560198 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.559855 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4" event={"ID":"3df19379-afa2-4117-8a3b-112889674309","Type":"ContainerDied","Data":"b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe"}
Apr 17 08:41:37.560198 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.559892 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4" event={"ID":"3df19379-afa2-4117-8a3b-112889674309","Type":"ContainerDied","Data":"f9e9185c95f34a52faa213e51fe573015b501b260ea5b984e2f0f27af170d002"}
Apr 17 08:41:37.560198 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.559911 2573 scope.go:117] "RemoveContainer" containerID="b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe"
Apr 17 08:41:37.561967 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.561936 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl" event={"ID":"79c3df37-1a9d-4bb7-b041-25241af62213","Type":"ContainerStarted","Data":"588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2"}
Apr 17 08:41:37.562172 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.562153 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"
Apr 17 08:41:37.568473 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.568456 2573 scope.go:117] "RemoveContainer" containerID="60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e"
Apr 17 08:41:37.575108 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.575094 2573 scope.go:117] "RemoveContainer" containerID="b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe"
Apr 17 08:41:37.575340 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:41:37.575320 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe\": container with ID starting with b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe not found: ID does not exist" containerID="b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe"
Apr 17 08:41:37.575445 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.575353 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe"} err="failed to get container status \"b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe\": rpc error: code = NotFound desc = could not find container \"b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe\": container with ID starting with b555e46b0c906b8d071279243a4381367b425c3fd4326c20ddced6ffe8f56ffe not found: ID does not exist"
Apr 17 08:41:37.575445 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.575398 2573 scope.go:117] "RemoveContainer" containerID="60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e"
Apr 17 08:41:37.575648 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:41:37.575631 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e\": container with ID starting with 60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e not found: ID does not exist" containerID="60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e"
Apr 17 08:41:37.575688 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.575656 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e"} err="failed to get container status \"60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e\": rpc error: code = NotFound desc = could not find container \"60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e\": container with ID starting with 60dd20da2aff505abd010d74484daf0755eeea8274c7c98683686643266f579e not found: ID does not exist"
Apr 17 08:41:37.579367 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.579325 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl" podStartSLOduration=6.579310683 podStartE2EDuration="6.579310683s" podCreationTimestamp="2026-04-17 08:41:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:41:37.577840394 +0000 UTC m=+3024.921091862" watchObservedRunningTime="2026-04-17 08:41:37.579310683 +0000 UTC m=+3024.922562146"
Apr 17 08:41:37.589962 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.589942 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"]
Apr 17 08:41:37.595640 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:37.595621 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-567d4"]
Apr 17 08:41:37.626791 ip-10-0-138-143 kubenswrapper[2573]: I0417
08:41:37.626770 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3df19379-afa2-4117-8a3b-112889674309-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:41:39.163795 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:41:39.163758 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df19379-afa2-4117-8a3b-112889674309" path="/var/lib/kubelet/pods/3df19379-afa2-4117-8a3b-112889674309/volumes" Apr 17 08:42:08.602581 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:08.602497 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl" Apr 17 08:42:12.119771 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.119740 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"] Apr 17 08:42:12.120171 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.119993 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl" podUID="79c3df37-1a9d-4bb7-b041-25241af62213" containerName="kserve-container" containerID="cri-o://588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2" gracePeriod=30 Apr 17 08:42:12.139840 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.139815 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh"] Apr 17 08:42:12.140055 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.140044 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3df19379-afa2-4117-8a3b-112889674309" containerName="kserve-container" Apr 17 08:42:12.140102 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.140057 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3df19379-afa2-4117-8a3b-112889674309" containerName="kserve-container" Apr 17 08:42:12.140102 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.140076 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3df19379-afa2-4117-8a3b-112889674309" containerName="storage-initializer" Apr 17 08:42:12.140102 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.140082 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df19379-afa2-4117-8a3b-112889674309" containerName="storage-initializer" Apr 17 08:42:12.140218 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.140132 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3df19379-afa2-4117-8a3b-112889674309" containerName="kserve-container" Apr 17 08:42:12.143183 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.143164 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" Apr 17 08:42:12.151397 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.151354 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh"] Apr 17 08:42:12.161158 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.161135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0256e8c6-c5fc-4b24-93e8-1a44c776a006-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-hk2fh\" (UID: \"0256e8c6-c5fc-4b24-93e8-1a44c776a006\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" Apr 17 08:42:12.261639 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.261615 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0256e8c6-c5fc-4b24-93e8-1a44c776a006-kserve-provision-location\") pod 
\"isvc-xgboost-runtime-predictor-687c7765c9-hk2fh\" (UID: \"0256e8c6-c5fc-4b24-93e8-1a44c776a006\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" Apr 17 08:42:12.261932 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.261916 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0256e8c6-c5fc-4b24-93e8-1a44c776a006-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-hk2fh\" (UID: \"0256e8c6-c5fc-4b24-93e8-1a44c776a006\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" Apr 17 08:42:12.453873 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.453843 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" Apr 17 08:42:12.574485 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.574461 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh"] Apr 17 08:42:12.580699 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:42:12.580666 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0256e8c6_c5fc_4b24_93e8_1a44c776a006.slice/crio-6c69617dc54c681c45b3e6f5ae5c60c542f6c13454d1cbb67091871184605d74 WatchSource:0}: Error finding container 6c69617dc54c681c45b3e6f5ae5c60c542f6c13454d1cbb67091871184605d74: Status 404 returned error can't find the container with id 6c69617dc54c681c45b3e6f5ae5c60c542f6c13454d1cbb67091871184605d74 Apr 17 08:42:12.655159 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.655131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" 
event={"ID":"0256e8c6-c5fc-4b24-93e8-1a44c776a006","Type":"ContainerStarted","Data":"90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32"} Apr 17 08:42:12.655159 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:12.655166 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" event={"ID":"0256e8c6-c5fc-4b24-93e8-1a44c776a006","Type":"ContainerStarted","Data":"6c69617dc54c681c45b3e6f5ae5c60c542f6c13454d1cbb67091871184605d74"} Apr 17 08:42:16.667072 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:16.666981 2573 generic.go:358] "Generic (PLEG): container finished" podID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerID="90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32" exitCode=0 Apr 17 08:42:16.667072 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:16.667057 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" event={"ID":"0256e8c6-c5fc-4b24-93e8-1a44c776a006","Type":"ContainerDied","Data":"90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32"} Apr 17 08:42:17.655158 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.655137 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl" Apr 17 08:42:17.670820 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.670793 2573 generic.go:358] "Generic (PLEG): container finished" podID="79c3df37-1a9d-4bb7-b041-25241af62213" containerID="588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2" exitCode=0 Apr 17 08:42:17.671205 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.670877 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl" Apr 17 08:42:17.671205 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.670872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl" event={"ID":"79c3df37-1a9d-4bb7-b041-25241af62213","Type":"ContainerDied","Data":"588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2"} Apr 17 08:42:17.671205 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.671037 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl" event={"ID":"79c3df37-1a9d-4bb7-b041-25241af62213","Type":"ContainerDied","Data":"beb12fb83ac257e99d619151b57e10d540a23f2361a261b5c46de1cede6b8b79"} Apr 17 08:42:17.671205 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.671064 2573 scope.go:117] "RemoveContainer" containerID="588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2" Apr 17 08:42:17.673514 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.673489 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" event={"ID":"0256e8c6-c5fc-4b24-93e8-1a44c776a006","Type":"ContainerStarted","Data":"54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e"} Apr 17 08:42:17.673796 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.673777 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" Apr 17 08:42:17.675082 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.675058 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 17 08:42:17.679550 
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.679530 2573 scope.go:117] "RemoveContainer" containerID="08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586" Apr 17 08:42:17.687550 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.687510 2573 scope.go:117] "RemoveContainer" containerID="588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2" Apr 17 08:42:17.687803 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:42:17.687785 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2\": container with ID starting with 588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2 not found: ID does not exist" containerID="588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2" Apr 17 08:42:17.687867 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.687815 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2"} err="failed to get container status \"588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2\": rpc error: code = NotFound desc = could not find container \"588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2\": container with ID starting with 588c0f41caac8a1268c3eeb684aa0a8114006e6f2df87ffb05af464e98d4e7c2 not found: ID does not exist" Apr 17 08:42:17.687867 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.687831 2573 scope.go:117] "RemoveContainer" containerID="08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586" Apr 17 08:42:17.688063 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:42:17.688046 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586\": container with ID starting with 
08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586 not found: ID does not exist" containerID="08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586" Apr 17 08:42:17.688157 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.688065 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586"} err="failed to get container status \"08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586\": rpc error: code = NotFound desc = could not find container \"08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586\": container with ID starting with 08497e822c08fbee5053d6ef12ecd1a7cb3faf868b06a61b12977804c0c4d586 not found: ID does not exist" Apr 17 08:42:17.689513 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.689478 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" podStartSLOduration=5.689465805 podStartE2EDuration="5.689465805s" podCreationTimestamp="2026-04-17 08:42:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:42:17.68860014 +0000 UTC m=+3065.031851604" watchObservedRunningTime="2026-04-17 08:42:17.689465805 +0000 UTC m=+3065.032717267" Apr 17 08:42:17.694897 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.694882 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79c3df37-1a9d-4bb7-b041-25241af62213-kserve-provision-location\") pod \"79c3df37-1a9d-4bb7-b041-25241af62213\" (UID: \"79c3df37-1a9d-4bb7-b041-25241af62213\") " Apr 17 08:42:17.695170 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.695147 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/79c3df37-1a9d-4bb7-b041-25241af62213-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "79c3df37-1a9d-4bb7-b041-25241af62213" (UID: "79c3df37-1a9d-4bb7-b041-25241af62213"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:42:17.795311 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.795260 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79c3df37-1a9d-4bb7-b041-25241af62213-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:42:17.992055 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.992029 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"] Apr 17 08:42:17.995530 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:17.995511 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-464wl"] Apr 17 08:42:18.677678 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:18.677644 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 17 08:42:19.163601 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:19.163568 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c3df37-1a9d-4bb7-b041-25241af62213" path="/var/lib/kubelet/pods/79c3df37-1a9d-4bb7-b041-25241af62213/volumes" Apr 17 08:42:28.678624 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:28.678580 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 17 08:42:38.678278 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:38.678238 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 17 08:42:48.678138 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:48.678095 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 17 08:42:58.677793 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:42:58.677754 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 17 08:43:08.678201 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:08.678163 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 17 08:43:18.679238 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:18.679196 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" Apr 17 08:43:22.260026 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.259994 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh"] Apr 17 08:43:22.260546 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.260344 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="kserve-container" containerID="cri-o://54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e" gracePeriod=30 Apr 17 08:43:22.321944 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.321916 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7"] Apr 17 08:43:22.322222 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.322209 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79c3df37-1a9d-4bb7-b041-25241af62213" containerName="kserve-container" Apr 17 08:43:22.322322 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.322224 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c3df37-1a9d-4bb7-b041-25241af62213" containerName="kserve-container" Apr 17 08:43:22.322322 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.322259 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79c3df37-1a9d-4bb7-b041-25241af62213" containerName="storage-initializer" Apr 17 08:43:22.322322 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.322266 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c3df37-1a9d-4bb7-b041-25241af62213" containerName="storage-initializer" Apr 17 08:43:22.322443 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.322324 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="79c3df37-1a9d-4bb7-b041-25241af62213" containerName="kserve-container" Apr 17 08:43:22.326367 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.326337 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" Apr 17 08:43:22.332154 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.332132 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7"] Apr 17 08:43:22.417597 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.417566 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb513c4e-8ba8-42dd-879a-b89db36c9d5c-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7\" (UID: \"eb513c4e-8ba8-42dd-879a-b89db36c9d5c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" Apr 17 08:43:22.518431 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.518338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb513c4e-8ba8-42dd-879a-b89db36c9d5c-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7\" (UID: \"eb513c4e-8ba8-42dd-879a-b89db36c9d5c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" Apr 17 08:43:22.518704 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.518686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb513c4e-8ba8-42dd-879a-b89db36c9d5c-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7\" (UID: \"eb513c4e-8ba8-42dd-879a-b89db36c9d5c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" Apr 17 08:43:22.637518 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.637467 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" Apr 17 08:43:22.752541 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.752511 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7"] Apr 17 08:43:22.755598 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:43:22.755569 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb513c4e_8ba8_42dd_879a_b89db36c9d5c.slice/crio-56fac8d2196c6dc742265a46fcc063484d2249080a0881ee8eca3ecb86f2c6d6 WatchSource:0}: Error finding container 56fac8d2196c6dc742265a46fcc063484d2249080a0881ee8eca3ecb86f2c6d6: Status 404 returned error can't find the container with id 56fac8d2196c6dc742265a46fcc063484d2249080a0881ee8eca3ecb86f2c6d6 Apr 17 08:43:22.840670 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.840639 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" event={"ID":"eb513c4e-8ba8-42dd-879a-b89db36c9d5c","Type":"ContainerStarted","Data":"05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac"} Apr 17 08:43:22.840776 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:22.840677 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" event={"ID":"eb513c4e-8ba8-42dd-879a-b89db36c9d5c","Type":"ContainerStarted","Data":"56fac8d2196c6dc742265a46fcc063484d2249080a0881ee8eca3ecb86f2c6d6"} Apr 17 08:43:25.588544 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.588524 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" Apr 17 08:43:25.642505 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.642485 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0256e8c6-c5fc-4b24-93e8-1a44c776a006-kserve-provision-location\") pod \"0256e8c6-c5fc-4b24-93e8-1a44c776a006\" (UID: \"0256e8c6-c5fc-4b24-93e8-1a44c776a006\") " Apr 17 08:43:25.642772 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.642752 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0256e8c6-c5fc-4b24-93e8-1a44c776a006-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0256e8c6-c5fc-4b24-93e8-1a44c776a006" (UID: "0256e8c6-c5fc-4b24-93e8-1a44c776a006"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:43:25.743009 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.742980 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0256e8c6-c5fc-4b24-93e8-1a44c776a006-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:43:25.849613 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.849578 2573 generic.go:358] "Generic (PLEG): container finished" podID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerID="54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e" exitCode=0 Apr 17 08:43:25.849758 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.849645 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" Apr 17 08:43:25.849758 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.849656 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" event={"ID":"0256e8c6-c5fc-4b24-93e8-1a44c776a006","Type":"ContainerDied","Data":"54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e"} Apr 17 08:43:25.849758 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.849688 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh" event={"ID":"0256e8c6-c5fc-4b24-93e8-1a44c776a006","Type":"ContainerDied","Data":"6c69617dc54c681c45b3e6f5ae5c60c542f6c13454d1cbb67091871184605d74"} Apr 17 08:43:25.849758 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.849706 2573 scope.go:117] "RemoveContainer" containerID="54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e" Apr 17 08:43:25.857506 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.857485 2573 scope.go:117] "RemoveContainer" containerID="90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32" Apr 17 08:43:25.864263 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.864238 2573 scope.go:117] "RemoveContainer" containerID="54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e" Apr 17 08:43:25.864520 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:43:25.864503 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e\": container with ID starting with 54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e not found: ID does not exist" containerID="54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e" Apr 17 08:43:25.864607 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.864532 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e"} err="failed to get container status \"54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e\": rpc error: code = NotFound desc = could not find container \"54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e\": container with ID starting with 54018810e6c4870024640465857f2fdad6d0da061aa96198b917a5577631251e not found: ID does not exist" Apr 17 08:43:25.864607 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.864555 2573 scope.go:117] "RemoveContainer" containerID="90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32" Apr 17 08:43:25.864807 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:43:25.864791 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32\": container with ID starting with 90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32 not found: ID does not exist" containerID="90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32" Apr 17 08:43:25.864847 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.864814 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32"} err="failed to get container status \"90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32\": rpc error: code = NotFound desc = could not find container \"90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32\": container with ID starting with 90baaea428cd43a0e489d664835871822484377b6d351ff0a46fe9c0838f0c32 not found: ID does not exist" Apr 17 08:43:25.869593 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.869575 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh"] Apr 17 08:43:25.872107 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:25.872087 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-hk2fh"] Apr 17 08:43:26.854417 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:26.854314 2573 generic.go:358] "Generic (PLEG): container finished" podID="eb513c4e-8ba8-42dd-879a-b89db36c9d5c" containerID="05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac" exitCode=0 Apr 17 08:43:26.854417 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:26.854361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" event={"ID":"eb513c4e-8ba8-42dd-879a-b89db36c9d5c","Type":"ContainerDied","Data":"05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac"} Apr 17 08:43:27.163109 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:27.163034 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" path="/var/lib/kubelet/pods/0256e8c6-c5fc-4b24-93e8-1a44c776a006/volumes" Apr 17 08:43:27.858202 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:27.858169 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" event={"ID":"eb513c4e-8ba8-42dd-879a-b89db36c9d5c","Type":"ContainerStarted","Data":"06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005"} Apr 17 08:43:27.858595 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:27.858395 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" Apr 17 08:43:27.873536 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:27.873486 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" podStartSLOduration=5.873469247 podStartE2EDuration="5.873469247s" podCreationTimestamp="2026-04-17 08:43:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:43:27.872912886 +0000 UTC m=+3135.216164371" watchObservedRunningTime="2026-04-17 08:43:27.873469247 +0000 UTC m=+3135.216720713" Apr 17 08:43:58.903333 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:43:58.903247 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" podUID="eb513c4e-8ba8-42dd-879a-b89db36c9d5c" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 17 08:44:08.868237 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:08.868197 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" Apr 17 08:44:12.433993 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.433957 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7"] Apr 17 08:44:12.434490 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.434330 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" podUID="eb513c4e-8ba8-42dd-879a-b89db36c9d5c" containerName="kserve-container" containerID="cri-o://06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005" gracePeriod=30 Apr 17 08:44:12.478925 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.478895 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95"] Apr 17 08:44:12.479162 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.479151 2573 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="kserve-container" Apr 17 08:44:12.479209 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.479165 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="kserve-container" Apr 17 08:44:12.479209 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.479179 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="storage-initializer" Apr 17 08:44:12.479209 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.479184 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="storage-initializer" Apr 17 08:44:12.479301 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.479228 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0256e8c6-c5fc-4b24-93e8-1a44c776a006" containerName="kserve-container" Apr 17 08:44:12.482270 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.482251 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" Apr 17 08:44:12.489399 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.489357 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95"] Apr 17 08:44:12.558892 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.558857 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34a20618-b236-4a88-ad95-dbf920587322-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-dtc95\" (UID: \"34a20618-b236-4a88-ad95-dbf920587322\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" Apr 17 08:44:12.660192 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.660156 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34a20618-b236-4a88-ad95-dbf920587322-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-dtc95\" (UID: \"34a20618-b236-4a88-ad95-dbf920587322\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" Apr 17 08:44:12.660512 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.660497 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34a20618-b236-4a88-ad95-dbf920587322-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-dtc95\" (UID: \"34a20618-b236-4a88-ad95-dbf920587322\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" Apr 17 08:44:12.792871 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.792778 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" Apr 17 08:44:12.909898 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.909863 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95"] Apr 17 08:44:12.913056 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:44:12.913027 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34a20618_b236_4a88_ad95_dbf920587322.slice/crio-38101d4fe3ed6daa19f105f658a29c78c2832ca9decf83d9c9b84c1ea5477a7c WatchSource:0}: Error finding container 38101d4fe3ed6daa19f105f658a29c78c2832ca9decf83d9c9b84c1ea5477a7c: Status 404 returned error can't find the container with id 38101d4fe3ed6daa19f105f658a29c78c2832ca9decf83d9c9b84c1ea5477a7c Apr 17 08:44:12.980771 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.980743 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" event={"ID":"34a20618-b236-4a88-ad95-dbf920587322","Type":"ContainerStarted","Data":"aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891"} Apr 17 08:44:12.980910 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:12.980776 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" event={"ID":"34a20618-b236-4a88-ad95-dbf920587322","Type":"ContainerStarted","Data":"38101d4fe3ed6daa19f105f658a29c78c2832ca9decf83d9c9b84c1ea5477a7c"} Apr 17 08:44:16.992760 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:16.992730 2573 generic.go:358] "Generic (PLEG): container finished" podID="34a20618-b236-4a88-ad95-dbf920587322" containerID="aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891" exitCode=0 Apr 17 08:44:16.993131 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:16.992797 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" event={"ID":"34a20618-b236-4a88-ad95-dbf920587322","Type":"ContainerDied","Data":"aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891"} Apr 17 08:44:17.996870 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:17.996833 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" event={"ID":"34a20618-b236-4a88-ad95-dbf920587322","Type":"ContainerStarted","Data":"4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f"} Apr 17 08:44:17.997301 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:17.997185 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" Apr 17 08:44:17.998439 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:17.998410 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 17 08:44:18.019595 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:18.019554 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" podStartSLOduration=6.0195429 podStartE2EDuration="6.0195429s" podCreationTimestamp="2026-04-17 08:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:44:18.017984119 +0000 UTC m=+3185.361235581" watchObservedRunningTime="2026-04-17 08:44:18.0195429 +0000 UTC m=+3185.362794362" Apr 17 08:44:18.867009 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:18.866966 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" 
podUID="eb513c4e-8ba8-42dd-879a-b89db36c9d5c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.53:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.132.0.53:8080: connect: connection refused" Apr 17 08:44:18.999298 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:18.999259 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 17 08:44:19.568854 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:19.568834 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" Apr 17 08:44:19.707871 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:19.707847 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb513c4e-8ba8-42dd-879a-b89db36c9d5c-kserve-provision-location\") pod \"eb513c4e-8ba8-42dd-879a-b89db36c9d5c\" (UID: \"eb513c4e-8ba8-42dd-879a-b89db36c9d5c\") " Apr 17 08:44:19.708232 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:19.708205 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb513c4e-8ba8-42dd-879a-b89db36c9d5c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eb513c4e-8ba8-42dd-879a-b89db36c9d5c" (UID: "eb513c4e-8ba8-42dd-879a-b89db36c9d5c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:44:19.808840 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:19.808814 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb513c4e-8ba8-42dd-879a-b89db36c9d5c-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:44:20.002629 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.002551 2573 generic.go:358] "Generic (PLEG): container finished" podID="eb513c4e-8ba8-42dd-879a-b89db36c9d5c" containerID="06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005" exitCode=0 Apr 17 08:44:20.002629 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.002622 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" Apr 17 08:44:20.003039 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.002640 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" event={"ID":"eb513c4e-8ba8-42dd-879a-b89db36c9d5c","Type":"ContainerDied","Data":"06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005"} Apr 17 08:44:20.003039 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.002679 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7" event={"ID":"eb513c4e-8ba8-42dd-879a-b89db36c9d5c","Type":"ContainerDied","Data":"56fac8d2196c6dc742265a46fcc063484d2249080a0881ee8eca3ecb86f2c6d6"} Apr 17 08:44:20.003039 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.002695 2573 scope.go:117] "RemoveContainer" containerID="06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005" Apr 17 08:44:20.011396 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.011363 2573 scope.go:117] "RemoveContainer" 
containerID="05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac" Apr 17 08:44:20.018115 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.018100 2573 scope.go:117] "RemoveContainer" containerID="06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005" Apr 17 08:44:20.018372 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:44:20.018352 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005\": container with ID starting with 06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005 not found: ID does not exist" containerID="06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005" Apr 17 08:44:20.018442 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.018400 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005"} err="failed to get container status \"06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005\": rpc error: code = NotFound desc = could not find container \"06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005\": container with ID starting with 06f04b4c334403d639dbdd58af4f0de72b8cbeced38ab8747a64b0e0d9568005 not found: ID does not exist" Apr 17 08:44:20.018442 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.018424 2573 scope.go:117] "RemoveContainer" containerID="05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac" Apr 17 08:44:20.018670 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:44:20.018650 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac\": container with ID starting with 05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac not found: ID does not exist" 
containerID="05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac" Apr 17 08:44:20.018713 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.018676 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac"} err="failed to get container status \"05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac\": rpc error: code = NotFound desc = could not find container \"05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac\": container with ID starting with 05ff9afdb4495c26c74b100d4a5c38a1e6dd962262f9a2c6af5dd9668e0410ac not found: ID does not exist" Apr 17 08:44:20.023087 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.023067 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7"] Apr 17 08:44:20.026561 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:20.026540 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-k8cm7"] Apr 17 08:44:21.163347 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:21.163315 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb513c4e-8ba8-42dd-879a-b89db36c9d5c" path="/var/lib/kubelet/pods/eb513c4e-8ba8-42dd-879a-b89db36c9d5c/volumes" Apr 17 08:44:28.999722 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:28.999638 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 17 08:44:39.000186 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:39.000146 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" 
podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 17 08:44:49.000061 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:49.000011 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 17 08:44:58.999490 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:44:58.999447 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 17 08:45:08.999503 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:08.999408 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 17 08:45:19.000574 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:19.000539 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" Apr 17 08:45:22.636023 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.635987 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95"] Apr 17 08:45:22.636721 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.636227 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" podUID="34a20618-b236-4a88-ad95-dbf920587322" 
containerName="kserve-container" containerID="cri-o://4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f" gracePeriod=30 Apr 17 08:45:22.686244 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.686219 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b"] Apr 17 08:45:22.686522 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.686508 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb513c4e-8ba8-42dd-879a-b89db36c9d5c" containerName="kserve-container" Apr 17 08:45:22.686572 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.686524 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb513c4e-8ba8-42dd-879a-b89db36c9d5c" containerName="kserve-container" Apr 17 08:45:22.686572 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.686544 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb513c4e-8ba8-42dd-879a-b89db36c9d5c" containerName="storage-initializer" Apr 17 08:45:22.686572 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.686550 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb513c4e-8ba8-42dd-879a-b89db36c9d5c" containerName="storage-initializer" Apr 17 08:45:22.686669 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.686590 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb513c4e-8ba8-42dd-879a-b89db36c9d5c" containerName="kserve-container" Apr 17 08:45:22.690491 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.690474 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" Apr 17 08:45:22.692964 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.692947 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 17 08:45:22.696825 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.696804 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b"] Apr 17 08:45:22.718752 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.718729 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54e279e6-390a-40f2-bec4-a82a45c1490d-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-85d8c7694d-lhh8b\" (UID: \"54e279e6-390a-40f2-bec4-a82a45c1490d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" Apr 17 08:45:22.820072 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.820047 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54e279e6-390a-40f2-bec4-a82a45c1490d-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-85d8c7694d-lhh8b\" (UID: \"54e279e6-390a-40f2-bec4-a82a45c1490d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" Apr 17 08:45:22.820358 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:22.820340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54e279e6-390a-40f2-bec4-a82a45c1490d-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-85d8c7694d-lhh8b\" (UID: \"54e279e6-390a-40f2-bec4-a82a45c1490d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" Apr 17 08:45:23.000563 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:23.000538 
2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" Apr 17 08:45:23.117111 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:23.117084 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b"] Apr 17 08:45:23.119817 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:45:23.119790 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54e279e6_390a_40f2_bec4_a82a45c1490d.slice/crio-04daa1d1aeddfdf503460438e14e4cfd19211dfdc0737eb78514ab5a85119f14 WatchSource:0}: Error finding container 04daa1d1aeddfdf503460438e14e4cfd19211dfdc0737eb78514ab5a85119f14: Status 404 returned error can't find the container with id 04daa1d1aeddfdf503460438e14e4cfd19211dfdc0737eb78514ab5a85119f14 Apr 17 08:45:23.169854 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:23.169832 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" event={"ID":"54e279e6-390a-40f2-bec4-a82a45c1490d","Type":"ContainerStarted","Data":"04daa1d1aeddfdf503460438e14e4cfd19211dfdc0737eb78514ab5a85119f14"} Apr 17 08:45:24.174212 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:24.174176 2573 generic.go:358] "Generic (PLEG): container finished" podID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerID="cfbaf45324d11a00a4a47f3ed97108bbf1273cc979c1eff52daace7ff1523896" exitCode=0 Apr 17 08:45:24.174659 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:24.174260 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" event={"ID":"54e279e6-390a-40f2-bec4-a82a45c1490d","Type":"ContainerDied","Data":"cfbaf45324d11a00a4a47f3ed97108bbf1273cc979c1eff52daace7ff1523896"} Apr 17 08:45:25.177976 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:25.177946 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" event={"ID":"54e279e6-390a-40f2-bec4-a82a45c1490d","Type":"ContainerStarted","Data":"119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16"} Apr 17 08:45:25.178352 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:25.178137 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" Apr 17 08:45:25.179100 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:25.179078 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 17 08:45:25.194973 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:25.194939 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" podStartSLOduration=3.19492922 podStartE2EDuration="3.19492922s" podCreationTimestamp="2026-04-17 08:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:45:25.193071987 +0000 UTC m=+3252.536323452" watchObservedRunningTime="2026-04-17 08:45:25.19492922 +0000 UTC m=+3252.538180683" Apr 17 08:45:25.963625 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:25.963604 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" Apr 17 08:45:26.041615 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.041593 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34a20618-b236-4a88-ad95-dbf920587322-kserve-provision-location\") pod \"34a20618-b236-4a88-ad95-dbf920587322\" (UID: \"34a20618-b236-4a88-ad95-dbf920587322\") " Apr 17 08:45:26.041906 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.041882 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a20618-b236-4a88-ad95-dbf920587322-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34a20618-b236-4a88-ad95-dbf920587322" (UID: "34a20618-b236-4a88-ad95-dbf920587322"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:45:26.142856 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.142788 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34a20618-b236-4a88-ad95-dbf920587322-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:45:26.182176 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.182152 2573 generic.go:358] "Generic (PLEG): container finished" podID="34a20618-b236-4a88-ad95-dbf920587322" containerID="4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f" exitCode=0 Apr 17 08:45:26.182579 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.182247 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" event={"ID":"34a20618-b236-4a88-ad95-dbf920587322","Type":"ContainerDied","Data":"4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f"} Apr 17 08:45:26.182579 ip-10-0-138-143 
kubenswrapper[2573]: I0417 08:45:26.182285 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" event={"ID":"34a20618-b236-4a88-ad95-dbf920587322","Type":"ContainerDied","Data":"38101d4fe3ed6daa19f105f658a29c78c2832ca9decf83d9c9b84c1ea5477a7c"} Apr 17 08:45:26.182579 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.182313 2573 scope.go:117] "RemoveContainer" containerID="4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f" Apr 17 08:45:26.182579 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.182261 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95" Apr 17 08:45:26.182769 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.182632 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 17 08:45:26.190026 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.190004 2573 scope.go:117] "RemoveContainer" containerID="aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891" Apr 17 08:45:26.196704 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.196686 2573 scope.go:117] "RemoveContainer" containerID="4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f" Apr 17 08:45:26.196936 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:45:26.196917 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f\": container with ID starting with 4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f not found: ID does not exist" containerID="4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f" 
Apr 17 08:45:26.197016 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.196946 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f"} err="failed to get container status \"4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f\": rpc error: code = NotFound desc = could not find container \"4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f\": container with ID starting with 4a0ff76827818a94dae8e0f2522d4b2aab8cbe8b04baadc40079cf9a1128c41f not found: ID does not exist" Apr 17 08:45:26.197016 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.196969 2573 scope.go:117] "RemoveContainer" containerID="aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891" Apr 17 08:45:26.197193 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:45:26.197176 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891\": container with ID starting with aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891 not found: ID does not exist" containerID="aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891" Apr 17 08:45:26.197247 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.197200 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891"} err="failed to get container status \"aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891\": rpc error: code = NotFound desc = could not find container \"aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891\": container with ID starting with aace2ac40ff039ed5571fbda45145fa399e0f864a051b7e2e181751ab8573891 not found: ID does not exist" Apr 17 08:45:26.202253 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.202233 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95"] Apr 17 08:45:26.205443 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:26.205422 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-dtc95"] Apr 17 08:45:27.164582 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:27.164549 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a20618-b236-4a88-ad95-dbf920587322" path="/var/lib/kubelet/pods/34a20618-b236-4a88-ad95-dbf920587322/volumes" Apr 17 08:45:36.183421 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:36.183353 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 17 08:45:46.183604 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:46.183561 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 17 08:45:56.182983 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:45:56.182937 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 17 08:46:06.183550 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:06.183509 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 17 08:46:16.183370 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:16.183329 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 17 08:46:26.183594 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:26.183556 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" Apr 17 08:46:32.800730 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.800636 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b"] Apr 17 08:46:32.801254 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.800994 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" containerID="cri-o://119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16" gracePeriod=30 Apr 17 08:46:32.909259 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.909223 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p"] Apr 17 08:46:32.909541 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.909525 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="kserve-container" Apr 17 08:46:32.909597 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.909543 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="kserve-container" Apr 17 08:46:32.909597 
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.909551 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="storage-initializer" Apr 17 08:46:32.909597 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.909556 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="storage-initializer" Apr 17 08:46:32.909701 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.909608 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="34a20618-b236-4a88-ad95-dbf920587322" containerName="kserve-container" Apr 17 08:46:32.912431 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.912415 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:46:32.914746 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.914727 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 17 08:46:32.921624 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.921601 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p"] Apr 17 08:46:32.978713 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.978686 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18bc96eb-9a3f-4f7c-bc98-171716364765-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p\" (UID: \"18bc96eb-9a3f-4f7c-bc98-171716364765\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:46:32.978817 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:32.978727 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/18bc96eb-9a3f-4f7c-bc98-171716364765-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p\" (UID: \"18bc96eb-9a3f-4f7c-bc98-171716364765\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:46:33.080014 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:33.079947 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18bc96eb-9a3f-4f7c-bc98-171716364765-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p\" (UID: \"18bc96eb-9a3f-4f7c-bc98-171716364765\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:46:33.080014 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:33.079998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/18bc96eb-9a3f-4f7c-bc98-171716364765-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p\" (UID: \"18bc96eb-9a3f-4f7c-bc98-171716364765\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:46:33.080315 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:33.080295 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18bc96eb-9a3f-4f7c-bc98-171716364765-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p\" (UID: \"18bc96eb-9a3f-4f7c-bc98-171716364765\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:46:33.080724 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:33.080708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/18bc96eb-9a3f-4f7c-bc98-171716364765-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p\" (UID: \"18bc96eb-9a3f-4f7c-bc98-171716364765\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:46:33.223415 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:33.223373 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:46:33.342450 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:33.342427 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p"] Apr 17 08:46:33.345046 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:46:33.345017 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18bc96eb_9a3f_4f7c_bc98_171716364765.slice/crio-2bbc2c9401c89b667a53071d74d2606f1f0fde0d7d8507bb5ee2d025e3663075 WatchSource:0}: Error finding container 2bbc2c9401c89b667a53071d74d2606f1f0fde0d7d8507bb5ee2d025e3663075: Status 404 returned error can't find the container with id 2bbc2c9401c89b667a53071d74d2606f1f0fde0d7d8507bb5ee2d025e3663075 Apr 17 08:46:33.346930 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:33.346908 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:46:33.358202 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:33.358176 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" event={"ID":"18bc96eb-9a3f-4f7c-bc98-171716364765","Type":"ContainerStarted","Data":"2bbc2c9401c89b667a53071d74d2606f1f0fde0d7d8507bb5ee2d025e3663075"} Apr 17 08:46:34.362571 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:34.362538 2573 generic.go:358] "Generic 
(PLEG): container finished" podID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerID="45b54718a106e44155580d981022780563d09b0d853918a6e44f27028a6c393b" exitCode=0 Apr 17 08:46:34.362922 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:34.362624 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" event={"ID":"18bc96eb-9a3f-4f7c-bc98-171716364765","Type":"ContainerDied","Data":"45b54718a106e44155580d981022780563d09b0d853918a6e44f27028a6c393b"} Apr 17 08:46:35.366806 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:35.366767 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" event={"ID":"18bc96eb-9a3f-4f7c-bc98-171716364765","Type":"ContainerStarted","Data":"350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602"} Apr 17 08:46:35.367215 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:35.367011 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:46:35.368092 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:35.368064 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 17 08:46:35.383572 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:35.383538 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" podStartSLOduration=3.383527775 podStartE2EDuration="3.383527775s" podCreationTimestamp="2026-04-17 08:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-17 08:46:35.382603539 +0000 UTC m=+3322.725855002" watchObservedRunningTime="2026-04-17 08:46:35.383527775 +0000 UTC m=+3322.726779238" Apr 17 08:46:36.182907 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:36.182864 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 17 08:46:36.370359 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:36.370323 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 17 08:46:36.643968 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:36.643947 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" Apr 17 08:46:36.705021 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:36.704989 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54e279e6-390a-40f2-bec4-a82a45c1490d-kserve-provision-location\") pod \"54e279e6-390a-40f2-bec4-a82a45c1490d\" (UID: \"54e279e6-390a-40f2-bec4-a82a45c1490d\") " Apr 17 08:46:36.705274 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:36.705252 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e279e6-390a-40f2-bec4-a82a45c1490d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "54e279e6-390a-40f2-bec4-a82a45c1490d" (UID: "54e279e6-390a-40f2-bec4-a82a45c1490d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:46:36.805476 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:36.805425 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54e279e6-390a-40f2-bec4-a82a45c1490d-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:46:37.374618 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.374587 2573 generic.go:358] "Generic (PLEG): container finished" podID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerID="119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16" exitCode=0 Apr 17 08:46:37.374993 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.374637 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" event={"ID":"54e279e6-390a-40f2-bec4-a82a45c1490d","Type":"ContainerDied","Data":"119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16"} Apr 17 08:46:37.374993 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.374658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" event={"ID":"54e279e6-390a-40f2-bec4-a82a45c1490d","Type":"ContainerDied","Data":"04daa1d1aeddfdf503460438e14e4cfd19211dfdc0737eb78514ab5a85119f14"} Apr 17 08:46:37.374993 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.374663 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b" Apr 17 08:46:37.374993 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.374674 2573 scope.go:117] "RemoveContainer" containerID="119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16" Apr 17 08:46:37.382205 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.382189 2573 scope.go:117] "RemoveContainer" containerID="cfbaf45324d11a00a4a47f3ed97108bbf1273cc979c1eff52daace7ff1523896" Apr 17 08:46:37.388788 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.388771 2573 scope.go:117] "RemoveContainer" containerID="119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16" Apr 17 08:46:37.389038 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:46:37.389015 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16\": container with ID starting with 119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16 not found: ID does not exist" containerID="119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16" Apr 17 08:46:37.389133 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.389053 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16"} err="failed to get container status \"119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16\": rpc error: code = NotFound desc = could not find container \"119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16\": container with ID starting with 119d4c172a1603238a1fafa1e8c2fd45c7edaa7aee6bec864f4cd28aa5855b16 not found: ID does not exist" Apr 17 08:46:37.389133 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.389118 2573 scope.go:117] "RemoveContainer" containerID="cfbaf45324d11a00a4a47f3ed97108bbf1273cc979c1eff52daace7ff1523896" Apr 17 
08:46:37.389581 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:46:37.389536 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfbaf45324d11a00a4a47f3ed97108bbf1273cc979c1eff52daace7ff1523896\": container with ID starting with cfbaf45324d11a00a4a47f3ed97108bbf1273cc979c1eff52daace7ff1523896 not found: ID does not exist" containerID="cfbaf45324d11a00a4a47f3ed97108bbf1273cc979c1eff52daace7ff1523896" Apr 17 08:46:37.389581 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.389569 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfbaf45324d11a00a4a47f3ed97108bbf1273cc979c1eff52daace7ff1523896"} err="failed to get container status \"cfbaf45324d11a00a4a47f3ed97108bbf1273cc979c1eff52daace7ff1523896\": rpc error: code = NotFound desc = could not find container \"cfbaf45324d11a00a4a47f3ed97108bbf1273cc979c1eff52daace7ff1523896\": container with ID starting with cfbaf45324d11a00a4a47f3ed97108bbf1273cc979c1eff52daace7ff1523896 not found: ID does not exist" Apr 17 08:46:37.390815 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.390797 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b"] Apr 17 08:46:37.393687 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:37.393667 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-lhh8b"] Apr 17 08:46:39.163557 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:39.163524 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" path="/var/lib/kubelet/pods/54e279e6-390a-40f2-bec4-a82a45c1490d/volumes" Apr 17 08:46:46.370997 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:46.370954 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" 
podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 17 08:46:56.370953 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:46:56.370913 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 17 08:47:06.371171 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:06.371132 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 17 08:47:16.371075 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:16.371034 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 17 08:47:26.370914 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:26.370867 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 17 08:47:36.371398 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:36.371349 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:47:42.943977 ip-10-0-138-143 
kubenswrapper[2573]: I0417 08:47:42.943946 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p"] Apr 17 08:47:42.946407 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:42.944197 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="kserve-container" containerID="cri-o://350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602" gracePeriod=30 Apr 17 08:47:44.010140 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.010111 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd"] Apr 17 08:47:44.010717 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.010352 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="storage-initializer" Apr 17 08:47:44.010717 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.010363 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="storage-initializer" Apr 17 08:47:44.010717 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.010398 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" Apr 17 08:47:44.010717 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.010404 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" Apr 17 08:47:44.010717 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.010444 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="54e279e6-390a-40f2-bec4-a82a45c1490d" containerName="kserve-container" Apr 17 08:47:44.013231 ip-10-0-138-143 
kubenswrapper[2573]: I0417 08:47:44.013212 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" Apr 17 08:47:44.022418 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.021832 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd"] Apr 17 08:47:44.050494 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.050469 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe16610c-d68d-4c73-85ae-7297bf6e2f15-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd\" (UID: \"fe16610c-d68d-4c73-85ae-7297bf6e2f15\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" Apr 17 08:47:44.151711 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.151686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe16610c-d68d-4c73-85ae-7297bf6e2f15-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd\" (UID: \"fe16610c-d68d-4c73-85ae-7297bf6e2f15\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" Apr 17 08:47:44.152032 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.152015 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe16610c-d68d-4c73-85ae-7297bf6e2f15-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd\" (UID: \"fe16610c-d68d-4c73-85ae-7297bf6e2f15\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" Apr 17 08:47:44.325699 ip-10-0-138-143 kubenswrapper[2573]: I0417 
08:47:44.325636 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" Apr 17 08:47:44.439152 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.439047 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd"] Apr 17 08:47:44.441885 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:47:44.441857 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe16610c_d68d_4c73_85ae_7297bf6e2f15.slice/crio-6ad209e236ff0576133a2a5b575bccbed1f8eaec3066d6e96136e4196649edad WatchSource:0}: Error finding container 6ad209e236ff0576133a2a5b575bccbed1f8eaec3066d6e96136e4196649edad: Status 404 returned error can't find the container with id 6ad209e236ff0576133a2a5b575bccbed1f8eaec3066d6e96136e4196649edad Apr 17 08:47:44.553224 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.553174 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" event={"ID":"fe16610c-d68d-4c73-85ae-7297bf6e2f15","Type":"ContainerStarted","Data":"cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752"} Apr 17 08:47:44.553224 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:44.553216 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" event={"ID":"fe16610c-d68d-4c73-85ae-7297bf6e2f15","Type":"ContainerStarted","Data":"6ad209e236ff0576133a2a5b575bccbed1f8eaec3066d6e96136e4196649edad"} Apr 17 08:47:46.370849 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:46.370809 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 17 08:47:46.890096 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:46.890075 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:47:46.970538 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:46.970471 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/18bc96eb-9a3f-4f7c-bc98-171716364765-cabundle-cert\") pod \"18bc96eb-9a3f-4f7c-bc98-171716364765\" (UID: \"18bc96eb-9a3f-4f7c-bc98-171716364765\") " Apr 17 08:47:46.970538 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:46.970531 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18bc96eb-9a3f-4f7c-bc98-171716364765-kserve-provision-location\") pod \"18bc96eb-9a3f-4f7c-bc98-171716364765\" (UID: \"18bc96eb-9a3f-4f7c-bc98-171716364765\") " Apr 17 08:47:46.970870 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:46.970847 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18bc96eb-9a3f-4f7c-bc98-171716364765-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "18bc96eb-9a3f-4f7c-bc98-171716364765" (UID: "18bc96eb-9a3f-4f7c-bc98-171716364765"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:47:46.970910 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:46.970844 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18bc96eb-9a3f-4f7c-bc98-171716364765-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "18bc96eb-9a3f-4f7c-bc98-171716364765" (UID: "18bc96eb-9a3f-4f7c-bc98-171716364765"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:47:47.070881 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.070858 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/18bc96eb-9a3f-4f7c-bc98-171716364765-cabundle-cert\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:47:47.070881 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.070879 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18bc96eb-9a3f-4f7c-bc98-171716364765-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:47:47.562764 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.562733 2573 generic.go:358] "Generic (PLEG): container finished" podID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerID="350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602" exitCode=0 Apr 17 08:47:47.562764 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.562772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" event={"ID":"18bc96eb-9a3f-4f7c-bc98-171716364765","Type":"ContainerDied","Data":"350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602"} Apr 17 08:47:47.563262 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.562796 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" event={"ID":"18bc96eb-9a3f-4f7c-bc98-171716364765","Type":"ContainerDied","Data":"2bbc2c9401c89b667a53071d74d2606f1f0fde0d7d8507bb5ee2d025e3663075"} Apr 17 08:47:47.563262 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.562807 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p" Apr 17 08:47:47.563262 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.562826 2573 scope.go:117] "RemoveContainer" containerID="350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602" Apr 17 08:47:47.571936 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.571919 2573 scope.go:117] "RemoveContainer" containerID="45b54718a106e44155580d981022780563d09b0d853918a6e44f27028a6c393b" Apr 17 08:47:47.579195 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.579169 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p"] Apr 17 08:47:47.579867 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.579849 2573 scope.go:117] "RemoveContainer" containerID="350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602" Apr 17 08:47:47.580166 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:47:47.580142 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602\": container with ID starting with 350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602 not found: ID does not exist" containerID="350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602" Apr 17 08:47:47.580250 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.580174 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602"} err="failed to get container status \"350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602\": rpc error: code = NotFound desc = could not find container \"350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602\": container with ID starting with 350f5894f80e3b5d964b5033b95ffe4c2c2dac1eb3821cdf2e899ed76bf44602 not found: 
ID does not exist" Apr 17 08:47:47.580250 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.580194 2573 scope.go:117] "RemoveContainer" containerID="45b54718a106e44155580d981022780563d09b0d853918a6e44f27028a6c393b" Apr 17 08:47:47.580496 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:47:47.580460 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45b54718a106e44155580d981022780563d09b0d853918a6e44f27028a6c393b\": container with ID starting with 45b54718a106e44155580d981022780563d09b0d853918a6e44f27028a6c393b not found: ID does not exist" containerID="45b54718a106e44155580d981022780563d09b0d853918a6e44f27028a6c393b" Apr 17 08:47:47.580595 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.580494 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45b54718a106e44155580d981022780563d09b0d853918a6e44f27028a6c393b"} err="failed to get container status \"45b54718a106e44155580d981022780563d09b0d853918a6e44f27028a6c393b\": rpc error: code = NotFound desc = could not find container \"45b54718a106e44155580d981022780563d09b0d853918a6e44f27028a6c393b\": container with ID starting with 45b54718a106e44155580d981022780563d09b0d853918a6e44f27028a6c393b not found: ID does not exist" Apr 17 08:47:47.582050 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:47.582018 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-tnv6p"] Apr 17 08:47:49.164825 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:49.164785 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" path="/var/lib/kubelet/pods/18bc96eb-9a3f-4f7c-bc98-171716364765/volumes" Apr 17 08:47:51.576423 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:51.576394 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd_fe16610c-d68d-4c73-85ae-7297bf6e2f15/storage-initializer/0.log" Apr 17 08:47:51.576770 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:51.576434 2573 generic.go:358] "Generic (PLEG): container finished" podID="fe16610c-d68d-4c73-85ae-7297bf6e2f15" containerID="cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752" exitCode=1 Apr 17 08:47:51.576770 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:51.576475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" event={"ID":"fe16610c-d68d-4c73-85ae-7297bf6e2f15","Type":"ContainerDied","Data":"cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752"} Apr 17 08:47:52.580330 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:52.580303 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd_fe16610c-d68d-4c73-85ae-7297bf6e2f15/storage-initializer/0.log" Apr 17 08:47:52.580703 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:52.580419 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" event={"ID":"fe16610c-d68d-4c73-85ae-7297bf6e2f15","Type":"ContainerStarted","Data":"acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176"} Apr 17 08:47:54.010013 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:54.009983 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd"] Apr 17 08:47:54.010361 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:54.010188 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" podUID="fe16610c-d68d-4c73-85ae-7297bf6e2f15" 
containerName="storage-initializer" containerID="cri-o://acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176" gracePeriod=30 Apr 17 08:47:55.073338 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.073304 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb"] Apr 17 08:47:55.073718 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.073572 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="storage-initializer" Apr 17 08:47:55.073718 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.073584 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="storage-initializer" Apr 17 08:47:55.073718 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.073597 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="kserve-container" Apr 17 08:47:55.073718 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.073606 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="kserve-container" Apr 17 08:47:55.073718 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.073660 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="18bc96eb-9a3f-4f7c-bc98-171716364765" containerName="kserve-container" Apr 17 08:47:55.076392 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.076364 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" Apr 17 08:47:55.078726 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.078699 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 17 08:47:55.086124 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.086100 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb"] Apr 17 08:47:55.121688 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.121659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c2789aa-cdeb-4ac6-ad81-96637fb02726-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb\" (UID: \"5c2789aa-cdeb-4ac6-ad81-96637fb02726\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" Apr 17 08:47:55.121780 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.121690 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5c2789aa-cdeb-4ac6-ad81-96637fb02726-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb\" (UID: \"5c2789aa-cdeb-4ac6-ad81-96637fb02726\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" Apr 17 08:47:55.222189 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.222166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c2789aa-cdeb-4ac6-ad81-96637fb02726-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb\" (UID: \"5c2789aa-cdeb-4ac6-ad81-96637fb02726\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" Apr 17 08:47:55.222322 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.222200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5c2789aa-cdeb-4ac6-ad81-96637fb02726-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb\" (UID: \"5c2789aa-cdeb-4ac6-ad81-96637fb02726\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" Apr 17 08:47:55.222608 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.222589 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c2789aa-cdeb-4ac6-ad81-96637fb02726-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb\" (UID: \"5c2789aa-cdeb-4ac6-ad81-96637fb02726\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" Apr 17 08:47:55.222884 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.222864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5c2789aa-cdeb-4ac6-ad81-96637fb02726-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb\" (UID: \"5c2789aa-cdeb-4ac6-ad81-96637fb02726\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" Apr 17 08:47:55.386198 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.386131 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" Apr 17 08:47:55.504434 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.504406 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb"] Apr 17 08:47:55.507316 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:47:55.507287 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c2789aa_cdeb_4ac6_ad81_96637fb02726.slice/crio-d0997360c427a9a2d9b2569e1397b87d0910030696d8eee4a818db6d5e342679 WatchSource:0}: Error finding container d0997360c427a9a2d9b2569e1397b87d0910030696d8eee4a818db6d5e342679: Status 404 returned error can't find the container with id d0997360c427a9a2d9b2569e1397b87d0910030696d8eee4a818db6d5e342679 Apr 17 08:47:55.589429 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.589402 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" event={"ID":"5c2789aa-cdeb-4ac6-ad81-96637fb02726","Type":"ContainerStarted","Data":"41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1"} Apr 17 08:47:55.589525 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:55.589440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" event={"ID":"5c2789aa-cdeb-4ac6-ad81-96637fb02726","Type":"ContainerStarted","Data":"d0997360c427a9a2d9b2569e1397b87d0910030696d8eee4a818db6d5e342679"} Apr 17 08:47:56.593822 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:56.593783 2573 generic.go:358] "Generic (PLEG): container finished" podID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerID="41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1" exitCode=0 Apr 17 08:47:56.594256 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:56.593865 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" event={"ID":"5c2789aa-cdeb-4ac6-ad81-96637fb02726","Type":"ContainerDied","Data":"41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1"} Apr 17 08:47:56.741555 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:56.741533 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd_fe16610c-d68d-4c73-85ae-7297bf6e2f15/storage-initializer/1.log" Apr 17 08:47:56.741874 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:56.741858 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd_fe16610c-d68d-4c73-85ae-7297bf6e2f15/storage-initializer/0.log" Apr 17 08:47:56.741932 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:56.741919 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" Apr 17 08:47:56.834711 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:56.834676 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe16610c-d68d-4c73-85ae-7297bf6e2f15-kserve-provision-location\") pod \"fe16610c-d68d-4c73-85ae-7297bf6e2f15\" (UID: \"fe16610c-d68d-4c73-85ae-7297bf6e2f15\") " Apr 17 08:47:56.834912 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:56.834890 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe16610c-d68d-4c73-85ae-7297bf6e2f15-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fe16610c-d68d-4c73-85ae-7297bf6e2f15" (UID: "fe16610c-d68d-4c73-85ae-7297bf6e2f15"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:47:56.935635 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:56.935611 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe16610c-d68d-4c73-85ae-7297bf6e2f15-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:47:57.597353 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.597328 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd_fe16610c-d68d-4c73-85ae-7297bf6e2f15/storage-initializer/1.log" Apr 17 08:47:57.597727 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.597673 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd_fe16610c-d68d-4c73-85ae-7297bf6e2f15/storage-initializer/0.log" Apr 17 08:47:57.597727 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.597707 2573 generic.go:358] "Generic (PLEG): container finished" podID="fe16610c-d68d-4c73-85ae-7297bf6e2f15" containerID="acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176" exitCode=1 Apr 17 08:47:57.597813 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.597761 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" event={"ID":"fe16610c-d68d-4c73-85ae-7297bf6e2f15","Type":"ContainerDied","Data":"acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176"} Apr 17 08:47:57.597813 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.597774 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" Apr 17 08:47:57.597813 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.597783 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd" event={"ID":"fe16610c-d68d-4c73-85ae-7297bf6e2f15","Type":"ContainerDied","Data":"6ad209e236ff0576133a2a5b575bccbed1f8eaec3066d6e96136e4196649edad"} Apr 17 08:47:57.597813 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.597798 2573 scope.go:117] "RemoveContainer" containerID="acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176" Apr 17 08:47:57.599495 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.599470 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" event={"ID":"5c2789aa-cdeb-4ac6-ad81-96637fb02726","Type":"ContainerStarted","Data":"9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3"} Apr 17 08:47:57.599838 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.599738 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" Apr 17 08:47:57.600938 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.600912 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 17 08:47:57.605994 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.605978 2573 scope.go:117] "RemoveContainer" containerID="cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752" Apr 17 08:47:57.612654 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.612637 2573 scope.go:117] "RemoveContainer" 
containerID="acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176" Apr 17 08:47:57.612895 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:47:57.612875 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176\": container with ID starting with acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176 not found: ID does not exist" containerID="acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176" Apr 17 08:47:57.612970 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.612906 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176"} err="failed to get container status \"acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176\": rpc error: code = NotFound desc = could not find container \"acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176\": container with ID starting with acd9ba3ad7be8a2df36967d2b073623432f2c94ed080c8c7ecbb7ce46175f176 not found: ID does not exist" Apr 17 08:47:57.612970 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.612929 2573 scope.go:117] "RemoveContainer" containerID="cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752" Apr 17 08:47:57.613152 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:47:57.613136 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752\": container with ID starting with cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752 not found: ID does not exist" containerID="cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752" Apr 17 08:47:57.613192 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.613158 2573 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752"} err="failed to get container status \"cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752\": rpc error: code = NotFound desc = could not find container \"cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752\": container with ID starting with cdc20dd0769ebdf6bfc7716ec7f31d221ed222b578b357b3893874245dcc9752 not found: ID does not exist" Apr 17 08:47:57.623524 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.623505 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd"] Apr 17 08:47:57.629097 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.629081 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-2x4gd"] Apr 17 08:47:57.644612 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:57.644577 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" podStartSLOduration=2.644567132 podStartE2EDuration="2.644567132s" podCreationTimestamp="2026-04-17 08:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:47:57.642558737 +0000 UTC m=+3404.985810211" watchObservedRunningTime="2026-04-17 08:47:57.644567132 +0000 UTC m=+3404.987818595" Apr 17 08:47:58.603868 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:47:58.603826 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 17 08:47:59.162978 ip-10-0-138-143 kubenswrapper[2573]: I0417 
08:47:59.162941 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe16610c-d68d-4c73-85ae-7297bf6e2f15" path="/var/lib/kubelet/pods/fe16610c-d68d-4c73-85ae-7297bf6e2f15/volumes" Apr 17 08:48:08.604078 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:48:08.603981 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 17 08:48:18.604352 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:48:18.604309 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 17 08:48:28.604116 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:48:28.604069 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 17 08:48:38.603943 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:48:38.603891 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 17 08:48:48.604449 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:48:48.604407 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" 
podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 17 08:48:58.604757 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:48:58.604726 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" Apr 17 08:49:05.109416 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:05.109363 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb"] Apr 17 08:49:05.109774 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:05.109629 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container" containerID="cri-o://9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3" gracePeriod=30 Apr 17 08:49:06.174921 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.174887 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx"] Apr 17 08:49:06.175271 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.175123 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe16610c-d68d-4c73-85ae-7297bf6e2f15" containerName="storage-initializer" Apr 17 08:49:06.175271 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.175134 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe16610c-d68d-4c73-85ae-7297bf6e2f15" containerName="storage-initializer" Apr 17 08:49:06.175271 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.175148 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe16610c-d68d-4c73-85ae-7297bf6e2f15" containerName="storage-initializer" Apr 17 08:49:06.175271 
ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.175153 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe16610c-d68d-4c73-85ae-7297bf6e2f15" containerName="storage-initializer" Apr 17 08:49:06.175271 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.175194 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe16610c-d68d-4c73-85ae-7297bf6e2f15" containerName="storage-initializer" Apr 17 08:49:06.175271 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.175201 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe16610c-d68d-4c73-85ae-7297bf6e2f15" containerName="storage-initializer" Apr 17 08:49:06.178101 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.178082 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" Apr 17 08:49:06.185283 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.185260 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx"] Apr 17 08:49:06.306091 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.306057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4915a068-9e86-4f1a-b0f0-545ec68f8b2d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx\" (UID: \"4915a068-9e86-4f1a-b0f0-545ec68f8b2d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" Apr 17 08:49:06.406689 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.406652 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4915a068-9e86-4f1a-b0f0-545ec68f8b2d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx\" (UID: 
\"4915a068-9e86-4f1a-b0f0-545ec68f8b2d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" Apr 17 08:49:06.407027 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.407005 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4915a068-9e86-4f1a-b0f0-545ec68f8b2d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx\" (UID: \"4915a068-9e86-4f1a-b0f0-545ec68f8b2d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" Apr 17 08:49:06.488746 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.488725 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" Apr 17 08:49:06.604008 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.603961 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx"] Apr 17 08:49:06.606974 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:49:06.606947 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4915a068_9e86_4f1a_b0f0_545ec68f8b2d.slice/crio-a709db453114d841d23a0030a4e6688ec42374495483695a21223295661c615b WatchSource:0}: Error finding container a709db453114d841d23a0030a4e6688ec42374495483695a21223295661c615b: Status 404 returned error can't find the container with id a709db453114d841d23a0030a4e6688ec42374495483695a21223295661c615b Apr 17 08:49:06.782889 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.782814 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" 
event={"ID":"4915a068-9e86-4f1a-b0f0-545ec68f8b2d","Type":"ContainerStarted","Data":"ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015"} Apr 17 08:49:06.782889 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:06.782852 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" event={"ID":"4915a068-9e86-4f1a-b0f0-545ec68f8b2d","Type":"ContainerStarted","Data":"a709db453114d841d23a0030a4e6688ec42374495483695a21223295661c615b"} Apr 17 08:49:08.604213 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:08.604163 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 17 08:49:08.935402 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:08.935365 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb"
Apr 17 08:49:09.024884 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.024857 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c2789aa-cdeb-4ac6-ad81-96637fb02726-kserve-provision-location\") pod \"5c2789aa-cdeb-4ac6-ad81-96637fb02726\" (UID: \"5c2789aa-cdeb-4ac6-ad81-96637fb02726\") "
Apr 17 08:49:09.025001 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.024930 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5c2789aa-cdeb-4ac6-ad81-96637fb02726-cabundle-cert\") pod \"5c2789aa-cdeb-4ac6-ad81-96637fb02726\" (UID: \"5c2789aa-cdeb-4ac6-ad81-96637fb02726\") "
Apr 17 08:49:09.025137 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.025114 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2789aa-cdeb-4ac6-ad81-96637fb02726-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5c2789aa-cdeb-4ac6-ad81-96637fb02726" (UID: "5c2789aa-cdeb-4ac6-ad81-96637fb02726"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:49:09.025274 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.025255 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c2789aa-cdeb-4ac6-ad81-96637fb02726-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "5c2789aa-cdeb-4ac6-ad81-96637fb02726" (UID: "5c2789aa-cdeb-4ac6-ad81-96637fb02726"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 08:49:09.125919 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.125846 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5c2789aa-cdeb-4ac6-ad81-96637fb02726-cabundle-cert\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:49:09.125919 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.125868 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c2789aa-cdeb-4ac6-ad81-96637fb02726-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:49:09.792008 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.791974 2573 generic.go:358] "Generic (PLEG): container finished" podID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerID="9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3" exitCode=0
Apr 17 08:49:09.792496 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.792048 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb"
Apr 17 08:49:09.792496 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.792076 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" event={"ID":"5c2789aa-cdeb-4ac6-ad81-96637fb02726","Type":"ContainerDied","Data":"9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3"}
Apr 17 08:49:09.792496 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.792133 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb" event={"ID":"5c2789aa-cdeb-4ac6-ad81-96637fb02726","Type":"ContainerDied","Data":"d0997360c427a9a2d9b2569e1397b87d0910030696d8eee4a818db6d5e342679"}
Apr 17 08:49:09.792496 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.792161 2573 scope.go:117] "RemoveContainer" containerID="9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3"
Apr 17 08:49:09.800614 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.800576 2573 scope.go:117] "RemoveContainer" containerID="41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1"
Apr 17 08:49:09.808650 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.808612 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb"]
Apr 17 08:49:09.811256 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.811228 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-58gnb"]
Apr 17 08:49:09.812583 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.812566 2573 scope.go:117] "RemoveContainer" containerID="9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3"
Apr 17 08:49:09.812843 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:49:09.812825 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3\": container with ID starting with 9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3 not found: ID does not exist" containerID="9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3"
Apr 17 08:49:09.812890 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.812851 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3"} err="failed to get container status \"9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3\": rpc error: code = NotFound desc = could not find container \"9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3\": container with ID starting with 9c68d3ce34ab18bcf5db6963129eb74cb3a9821e4949c5d63fcd2d2d5a2216b3 not found: ID does not exist"
Apr 17 08:49:09.812890 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.812869 2573 scope.go:117] "RemoveContainer" containerID="41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1"
Apr 17 08:49:09.813103 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:49:09.813084 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1\": container with ID starting with 41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1 not found: ID does not exist" containerID="41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1"
Apr 17 08:49:09.813150 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:09.813106 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1"} err="failed to get container status \"41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1\": rpc error: code = NotFound desc = could not find container \"41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1\": container with ID starting with 41f50ac695c8ec28492ae7b7569865b8f53a65554bb98aad38cc596b173737d1 not found: ID does not exist"
Apr 17 08:49:11.164150 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:11.164112 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" path="/var/lib/kubelet/pods/5c2789aa-cdeb-4ac6-ad81-96637fb02726/volumes"
Apr 17 08:49:13.809430 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:13.809401 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx_4915a068-9e86-4f1a-b0f0-545ec68f8b2d/storage-initializer/0.log"
Apr 17 08:49:13.809907 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:13.809437 2573 generic.go:358] "Generic (PLEG): container finished" podID="4915a068-9e86-4f1a-b0f0-545ec68f8b2d" containerID="ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015" exitCode=1
Apr 17 08:49:13.809907 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:13.809496 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" event={"ID":"4915a068-9e86-4f1a-b0f0-545ec68f8b2d","Type":"ContainerDied","Data":"ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015"}
Apr 17 08:49:14.813900 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:14.813870 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx_4915a068-9e86-4f1a-b0f0-545ec68f8b2d/storage-initializer/0.log"
Apr 17 08:49:14.814275 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:14.813959 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" event={"ID":"4915a068-9e86-4f1a-b0f0-545ec68f8b2d","Type":"ContainerStarted","Data":"aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d"}
Apr 17 08:49:16.190800 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:16.190767 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx"]
Apr 17 08:49:16.191259 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:16.191065 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" podUID="4915a068-9e86-4f1a-b0f0-545ec68f8b2d" containerName="storage-initializer" containerID="cri-o://aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d" gracePeriod=30
Apr 17 08:49:17.124471 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.124451 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx_4915a068-9e86-4f1a-b0f0-545ec68f8b2d/storage-initializer/1.log"
Apr 17 08:49:17.124803 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.124790 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx_4915a068-9e86-4f1a-b0f0-545ec68f8b2d/storage-initializer/0.log"
Apr 17 08:49:17.124863 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.124848 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx"
Apr 17 08:49:17.236842 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.236815 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"]
Apr 17 08:49:17.237185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.237030 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="storage-initializer"
Apr 17 08:49:17.237185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.237041 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="storage-initializer"
Apr 17 08:49:17.237185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.237052 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4915a068-9e86-4f1a-b0f0-545ec68f8b2d" containerName="storage-initializer"
Apr 17 08:49:17.237185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.237058 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4915a068-9e86-4f1a-b0f0-545ec68f8b2d" containerName="storage-initializer"
Apr 17 08:49:17.237185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.237064 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4915a068-9e86-4f1a-b0f0-545ec68f8b2d" containerName="storage-initializer"
Apr 17 08:49:17.237185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.237071 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4915a068-9e86-4f1a-b0f0-545ec68f8b2d" containerName="storage-initializer"
Apr 17 08:49:17.237185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.237085 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container"
Apr 17 08:49:17.237185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.237090 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container"
Apr 17 08:49:17.237185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.237149 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4915a068-9e86-4f1a-b0f0-545ec68f8b2d" containerName="storage-initializer"
Apr 17 08:49:17.237185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.237157 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c2789aa-cdeb-4ac6-ad81-96637fb02726" containerName="kserve-container"
Apr 17 08:49:17.237517 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.237231 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4915a068-9e86-4f1a-b0f0-545ec68f8b2d" containerName="storage-initializer"
Apr 17 08:49:17.239967 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.239950 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"
Apr 17 08:49:17.242562 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.242541 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 17 08:49:17.250122 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.250098 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"]
Apr 17 08:49:17.285174 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.285152 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4915a068-9e86-4f1a-b0f0-545ec68f8b2d-kserve-provision-location\") pod \"4915a068-9e86-4f1a-b0f0-545ec68f8b2d\" (UID: \"4915a068-9e86-4f1a-b0f0-545ec68f8b2d\") "
Apr 17 08:49:17.285415 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.285370 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4915a068-9e86-4f1a-b0f0-545ec68f8b2d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4915a068-9e86-4f1a-b0f0-545ec68f8b2d" (UID: "4915a068-9e86-4f1a-b0f0-545ec68f8b2d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:49:17.385809 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.385779 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d0aafed-8821-4f11-ada0-9c7678f914d2-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn\" (UID: \"6d0aafed-8821-4f11-ada0-9c7678f914d2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"
Apr 17 08:49:17.385921 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.385849 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6d0aafed-8821-4f11-ada0-9c7678f914d2-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn\" (UID: \"6d0aafed-8821-4f11-ada0-9c7678f914d2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"
Apr 17 08:49:17.385921 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.385897 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4915a068-9e86-4f1a-b0f0-545ec68f8b2d-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:49:17.487052 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.486992 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6d0aafed-8821-4f11-ada0-9c7678f914d2-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn\" (UID: \"6d0aafed-8821-4f11-ada0-9c7678f914d2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"
Apr 17 08:49:17.487052 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.487038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d0aafed-8821-4f11-ada0-9c7678f914d2-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn\" (UID: \"6d0aafed-8821-4f11-ada0-9c7678f914d2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"
Apr 17 08:49:17.487366 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.487350 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d0aafed-8821-4f11-ada0-9c7678f914d2-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn\" (UID: \"6d0aafed-8821-4f11-ada0-9c7678f914d2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"
Apr 17 08:49:17.487610 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.487592 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6d0aafed-8821-4f11-ada0-9c7678f914d2-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn\" (UID: \"6d0aafed-8821-4f11-ada0-9c7678f914d2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"
Apr 17 08:49:17.549213 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.549195 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"
Apr 17 08:49:17.663120 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.663093 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"]
Apr 17 08:49:17.665551 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:49:17.665525 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d0aafed_8821_4f11_ada0_9c7678f914d2.slice/crio-a529c1495a475f30c86e120c19c8f9ec5e1ca850b8163971267804811e1d5291 WatchSource:0}: Error finding container a529c1495a475f30c86e120c19c8f9ec5e1ca850b8163971267804811e1d5291: Status 404 returned error can't find the container with id a529c1495a475f30c86e120c19c8f9ec5e1ca850b8163971267804811e1d5291
Apr 17 08:49:17.823791 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.823705 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" event={"ID":"6d0aafed-8821-4f11-ada0-9c7678f914d2","Type":"ContainerStarted","Data":"9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279"}
Apr 17 08:49:17.823791 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.823744 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" event={"ID":"6d0aafed-8821-4f11-ada0-9c7678f914d2","Type":"ContainerStarted","Data":"a529c1495a475f30c86e120c19c8f9ec5e1ca850b8163971267804811e1d5291"}
Apr 17 08:49:17.824872 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.824853 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx_4915a068-9e86-4f1a-b0f0-545ec68f8b2d/storage-initializer/1.log"
Apr 17 08:49:17.825178 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.825164 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx_4915a068-9e86-4f1a-b0f0-545ec68f8b2d/storage-initializer/0.log"
Apr 17 08:49:17.825237 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.825196 2573 generic.go:358] "Generic (PLEG): container finished" podID="4915a068-9e86-4f1a-b0f0-545ec68f8b2d" containerID="aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d" exitCode=1
Apr 17 08:49:17.825301 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.825262 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" event={"ID":"4915a068-9e86-4f1a-b0f0-545ec68f8b2d","Type":"ContainerDied","Data":"aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d"}
Apr 17 08:49:17.825301 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.825279 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx"
Apr 17 08:49:17.825439 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.825306 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx" event={"ID":"4915a068-9e86-4f1a-b0f0-545ec68f8b2d","Type":"ContainerDied","Data":"a709db453114d841d23a0030a4e6688ec42374495483695a21223295661c615b"}
Apr 17 08:49:17.825439 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.825330 2573 scope.go:117] "RemoveContainer" containerID="aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d"
Apr 17 08:49:17.832263 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.832247 2573 scope.go:117] "RemoveContainer" containerID="ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015"
Apr 17 08:49:17.840920 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.840903 2573 scope.go:117] "RemoveContainer" containerID="aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d"
Apr 17 08:49:17.841175 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:49:17.841155 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d\": container with ID starting with aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d not found: ID does not exist" containerID="aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d"
Apr 17 08:49:17.841238 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.841183 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d"} err="failed to get container status \"aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d\": rpc error: code = NotFound desc = could not find container \"aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d\": container with ID starting with aa10a2cbb317ca20bd2efea849e94453ef27307afe4ce4225022da72c1f0c47d not found: ID does not exist"
Apr 17 08:49:17.841238 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.841201 2573 scope.go:117] "RemoveContainer" containerID="ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015"
Apr 17 08:49:17.841456 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:49:17.841438 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015\": container with ID starting with ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015 not found: ID does not exist" containerID="ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015"
Apr 17 08:49:17.841498 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.841463 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015"} err="failed to get container status \"ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015\": rpc error: code = NotFound desc = could not find container \"ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015\": container with ID starting with ebfbe21ef1b278b3dcac92852423cebab52e2b3a9422c237a86734914a7d5015 not found: ID does not exist"
Apr 17 08:49:17.861732 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.861709 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx"]
Apr 17 08:49:17.865365 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:17.865331 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-t87vx"]
Apr 17 08:49:18.830492 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:18.830398 2573 generic.go:358] "Generic (PLEG): container finished" podID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerID="9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279" exitCode=0
Apr 17 08:49:18.830919 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:18.830487 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" event={"ID":"6d0aafed-8821-4f11-ada0-9c7678f914d2","Type":"ContainerDied","Data":"9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279"}
Apr 17 08:49:19.163900 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:19.163830 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4915a068-9e86-4f1a-b0f0-545ec68f8b2d" path="/var/lib/kubelet/pods/4915a068-9e86-4f1a-b0f0-545ec68f8b2d/volumes"
Apr 17 08:49:19.835061 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:19.835033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" event={"ID":"6d0aafed-8821-4f11-ada0-9c7678f914d2","Type":"ContainerStarted","Data":"9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12"}
Apr 17 08:49:19.835522 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:19.835235 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"
Apr 17 08:49:19.836298 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:19.836250 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused"
Apr 17 08:49:19.850649 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:19.850601 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" podStartSLOduration=2.8505887850000002 podStartE2EDuration="2.850588785s" podCreationTimestamp="2026-04-17 08:49:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:49:19.849217914 +0000 UTC m=+3487.192469377" watchObservedRunningTime="2026-04-17 08:49:19.850588785 +0000 UTC m=+3487.193840247"
Apr 17 08:49:20.837872 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:20.837830 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused"
Apr 17 08:49:30.837823 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:30.837778 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused"
Apr 17 08:49:40.838040 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:40.837992 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused"
Apr 17 08:49:50.837924 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:49:50.837886 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused"
Apr 17 08:50:00.838217 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:00.838178 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused"
Apr 17 08:50:10.837924 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:10.837884 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused"
Apr 17 08:50:20.839314 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:20.839286 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"
Apr 17 08:50:27.267010 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:27.266978 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"]
Apr 17 08:50:27.267577 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:27.267310 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" containerID="cri-o://9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12" gracePeriod=30
Apr 17 08:50:28.341333 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:28.341304 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv"]
Apr 17 08:50:28.344274 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:28.344255 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv"
Apr 17 08:50:28.352249 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:28.352227 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv"]
Apr 17 08:50:28.472959 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:28.472925 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv\" (UID: \"8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv"
Apr 17 08:50:28.573462 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:28.573437 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv\" (UID: \"8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv"
Apr 17 08:50:28.573773 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:28.573756 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv\" (UID: \"8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv"
Apr 17 08:50:28.654261 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:28.654194 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv"
Apr 17 08:50:28.777904 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:28.777877 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv"]
Apr 17 08:50:28.781999 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:50:28.781215 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b3ea2d2_3ad0_4a4e_96c8_5c08e60da024.slice/crio-6877eadc12ecc7c279fff70e80968c284ed8bb4918c4fd295fbce4ca4df8c6b7 WatchSource:0}: Error finding container 6877eadc12ecc7c279fff70e80968c284ed8bb4918c4fd295fbce4ca4df8c6b7: Status 404 returned error can't find the container with id 6877eadc12ecc7c279fff70e80968c284ed8bb4918c4fd295fbce4ca4df8c6b7
Apr 17 08:50:29.019048 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:29.019013 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv" event={"ID":"8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024","Type":"ContainerStarted","Data":"400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca"}
Apr 17 08:50:29.019048 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:29.019050 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv" event={"ID":"8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024","Type":"ContainerStarted","Data":"6877eadc12ecc7c279fff70e80968c284ed8bb4918c4fd295fbce4ca4df8c6b7"}
Apr 17 08:50:30.838596 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:30.838551 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused"
Apr 17 08:50:31.199244 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:31.199223 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"
Apr 17 08:50:31.294075 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:31.294049 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d0aafed-8821-4f11-ada0-9c7678f914d2-kserve-provision-location\") pod \"6d0aafed-8821-4f11-ada0-9c7678f914d2\" (UID: \"6d0aafed-8821-4f11-ada0-9c7678f914d2\") "
Apr 17 08:50:31.294241 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:31.294124 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6d0aafed-8821-4f11-ada0-9c7678f914d2-cabundle-cert\") pod \"6d0aafed-8821-4f11-ada0-9c7678f914d2\" (UID: \"6d0aafed-8821-4f11-ada0-9c7678f914d2\") "
Apr 17 08:50:31.294486 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:31.294463 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d0aafed-8821-4f11-ada0-9c7678f914d2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6d0aafed-8821-4f11-ada0-9c7678f914d2" (UID: "6d0aafed-8821-4f11-ada0-9c7678f914d2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:50:31.294574 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:31.294529 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d0aafed-8821-4f11-ada0-9c7678f914d2-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6d0aafed-8821-4f11-ada0-9c7678f914d2" (UID: "6d0aafed-8821-4f11-ada0-9c7678f914d2"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 08:50:31.395366 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:31.395299 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d0aafed-8821-4f11-ada0-9c7678f914d2-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:50:31.395366 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:31.395333 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6d0aafed-8821-4f11-ada0-9c7678f914d2-cabundle-cert\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\""
Apr 17 08:50:32.027920 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:32.027879 2573 generic.go:358] "Generic (PLEG): container finished" podID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerID="9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12" exitCode=0
Apr 17 08:50:32.028332 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:32.027954 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" Apr 17 08:50:32.028332 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:32.027968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" event={"ID":"6d0aafed-8821-4f11-ada0-9c7678f914d2","Type":"ContainerDied","Data":"9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12"} Apr 17 08:50:32.028332 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:32.028007 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn" event={"ID":"6d0aafed-8821-4f11-ada0-9c7678f914d2","Type":"ContainerDied","Data":"a529c1495a475f30c86e120c19c8f9ec5e1ca850b8163971267804811e1d5291"} Apr 17 08:50:32.028332 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:32.028023 2573 scope.go:117] "RemoveContainer" containerID="9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12" Apr 17 08:50:32.037362 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:32.037345 2573 scope.go:117] "RemoveContainer" containerID="9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279" Apr 17 08:50:32.044110 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:32.044096 2573 scope.go:117] "RemoveContainer" containerID="9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12" Apr 17 08:50:32.044355 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:50:32.044324 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12\": container with ID starting with 9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12 not found: ID does not exist" containerID="9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12" Apr 17 08:50:32.044410 ip-10-0-138-143 kubenswrapper[2573]: I0417 
08:50:32.044363 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12"} err="failed to get container status \"9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12\": rpc error: code = NotFound desc = could not find container \"9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12\": container with ID starting with 9bdc7bbba328257962e9a71e7eb18c0a5965e09d098bef60ce63cbeebceeda12 not found: ID does not exist" Apr 17 08:50:32.044410 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:32.044393 2573 scope.go:117] "RemoveContainer" containerID="9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279" Apr 17 08:50:32.044579 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:50:32.044565 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279\": container with ID starting with 9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279 not found: ID does not exist" containerID="9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279" Apr 17 08:50:32.044618 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:32.044582 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279"} err="failed to get container status \"9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279\": rpc error: code = NotFound desc = could not find container \"9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279\": container with ID starting with 9f360b71dc04c4fcd87ef5ce7dbe7a8649a781950c4b7620cecd1702a6067279 not found: ID does not exist" Apr 17 08:50:32.048749 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:32.048728 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"] Apr 17 08:50:32.051832 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:32.051811 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-lwrxn"] Apr 17 08:50:33.163432 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:33.163370 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" path="/var/lib/kubelet/pods/6d0aafed-8821-4f11-ada0-9c7678f914d2/volumes" Apr 17 08:50:35.042932 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:35.042900 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv_8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024/storage-initializer/0.log" Apr 17 08:50:35.043317 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:35.042945 2573 generic.go:358] "Generic (PLEG): container finished" podID="8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024" containerID="400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca" exitCode=1 Apr 17 08:50:35.043317 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:35.043019 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv" event={"ID":"8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024","Type":"ContainerDied","Data":"400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca"} Apr 17 08:50:36.047505 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:36.047477 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv_8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024/storage-initializer/0.log" Apr 17 08:50:36.047906 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:36.047555 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv" event={"ID":"8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024","Type":"ContainerStarted","Data":"e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648"} Apr 17 08:50:38.335658 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:38.335627 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv"] Apr 17 08:50:38.336003 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:38.335835 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv" podUID="8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024" containerName="storage-initializer" containerID="cri-o://e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648" gracePeriod=30 Apr 17 08:50:39.977207 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:39.977184 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv_8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024/storage-initializer/1.log" Apr 17 08:50:39.977553 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:39.977536 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv_8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024/storage-initializer/0.log" Apr 17 08:50:39.977612 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:39.977601 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv" Apr 17 08:50:40.051939 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.051878 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024-kserve-provision-location\") pod \"8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024\" (UID: \"8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024\") " Apr 17 08:50:40.052163 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.052140 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024" (UID: "8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:50:40.059366 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.059349 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv_8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024/storage-initializer/1.log" Apr 17 08:50:40.059697 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.059682 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv_8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024/storage-initializer/0.log" Apr 17 08:50:40.059771 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.059721 2573 generic.go:358] "Generic (PLEG): container finished" podID="8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024" containerID="e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648" exitCode=1 Apr 17 08:50:40.059833 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.059792 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv" Apr 17 08:50:40.059833 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.059794 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv" event={"ID":"8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024","Type":"ContainerDied","Data":"e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648"} Apr 17 08:50:40.059833 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.059830 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv" event={"ID":"8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024","Type":"ContainerDied","Data":"6877eadc12ecc7c279fff70e80968c284ed8bb4918c4fd295fbce4ca4df8c6b7"} Apr 17 08:50:40.059959 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.059846 2573 scope.go:117] "RemoveContainer" containerID="e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648" Apr 17 08:50:40.067865 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.067839 2573 scope.go:117] "RemoveContainer" containerID="400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca" Apr 17 08:50:40.074579 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.074562 2573 scope.go:117] "RemoveContainer" containerID="e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648" Apr 17 08:50:40.074818 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:50:40.074798 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648\": container with ID starting with e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648 not found: ID does not exist" containerID="e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648" Apr 17 08:50:40.074885 ip-10-0-138-143 kubenswrapper[2573]: I0417 
08:50:40.074827 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648"} err="failed to get container status \"e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648\": rpc error: code = NotFound desc = could not find container \"e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648\": container with ID starting with e0b4980f8e8ca21ad07c3d37b294da1a7f94025f0c93fa05de64d1962c808648 not found: ID does not exist" Apr 17 08:50:40.074885 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.074851 2573 scope.go:117] "RemoveContainer" containerID="400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca" Apr 17 08:50:40.075074 ip-10-0-138-143 kubenswrapper[2573]: E0417 08:50:40.075055 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca\": container with ID starting with 400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca not found: ID does not exist" containerID="400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca" Apr 17 08:50:40.075123 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.075082 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca"} err="failed to get container status \"400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca\": rpc error: code = NotFound desc = could not find container \"400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca\": container with ID starting with 400381b56821d8152adcb4f2ba013a893edff412d956dbfe98069b7147ff18ca not found: ID does not exist" Apr 17 08:50:40.091244 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.091221 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv"] Apr 17 08:50:40.094587 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.094564 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-nqhxv"] Apr 17 08:50:40.152911 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:40.152887 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024-kserve-provision-location\") on node \"ip-10-0-138-143.ec2.internal\" DevicePath \"\"" Apr 17 08:50:41.162863 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:50:41.162820 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024" path="/var/lib/kubelet/pods/8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024/volumes" Apr 17 08:51:06.186962 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.186886 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jbj8t/must-gather-q2j5w"] Apr 17 08:51:06.187488 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.187106 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024" containerName="storage-initializer" Apr 17 08:51:06.187488 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.187117 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024" containerName="storage-initializer" Apr 17 08:51:06.187488 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.187126 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="storage-initializer" Apr 17 08:51:06.187488 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.187132 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" 
containerName="storage-initializer" Apr 17 08:51:06.187488 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.187142 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024" containerName="storage-initializer" Apr 17 08:51:06.187488 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.187148 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024" containerName="storage-initializer" Apr 17 08:51:06.187488 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.187159 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" Apr 17 08:51:06.187488 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.187164 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" Apr 17 08:51:06.187488 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.187213 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024" containerName="storage-initializer" Apr 17 08:51:06.187488 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.187222 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d0aafed-8821-4f11-ada0-9c7678f914d2" containerName="kserve-container" Apr 17 08:51:06.187488 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.187229 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b3ea2d2-3ad0-4a4e-96c8-5c08e60da024" containerName="storage-initializer" Apr 17 08:51:06.190073 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.190058 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jbj8t/must-gather-q2j5w" Apr 17 08:51:06.192507 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.192486 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jbj8t\"/\"kube-root-ca.crt\"" Apr 17 08:51:06.193374 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.193357 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jbj8t\"/\"default-dockercfg-rq654\"" Apr 17 08:51:06.193374 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.193373 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jbj8t\"/\"openshift-service-ca.crt\"" Apr 17 08:51:06.199694 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.199671 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jbj8t/must-gather-q2j5w"] Apr 17 08:51:06.317913 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.317889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d32a312-915b-4dd1-ac17-6b1dbc9065e1-must-gather-output\") pod \"must-gather-q2j5w\" (UID: \"1d32a312-915b-4dd1-ac17-6b1dbc9065e1\") " pod="openshift-must-gather-jbj8t/must-gather-q2j5w" Apr 17 08:51:06.318047 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.317931 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp7gl\" (UniqueName: \"kubernetes.io/projected/1d32a312-915b-4dd1-ac17-6b1dbc9065e1-kube-api-access-tp7gl\") pod \"must-gather-q2j5w\" (UID: \"1d32a312-915b-4dd1-ac17-6b1dbc9065e1\") " pod="openshift-must-gather-jbj8t/must-gather-q2j5w" Apr 17 08:51:06.419053 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.419021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/1d32a312-915b-4dd1-ac17-6b1dbc9065e1-must-gather-output\") pod \"must-gather-q2j5w\" (UID: \"1d32a312-915b-4dd1-ac17-6b1dbc9065e1\") " pod="openshift-must-gather-jbj8t/must-gather-q2j5w" Apr 17 08:51:06.419138 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.419066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tp7gl\" (UniqueName: \"kubernetes.io/projected/1d32a312-915b-4dd1-ac17-6b1dbc9065e1-kube-api-access-tp7gl\") pod \"must-gather-q2j5w\" (UID: \"1d32a312-915b-4dd1-ac17-6b1dbc9065e1\") " pod="openshift-must-gather-jbj8t/must-gather-q2j5w" Apr 17 08:51:06.419361 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.419344 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d32a312-915b-4dd1-ac17-6b1dbc9065e1-must-gather-output\") pod \"must-gather-q2j5w\" (UID: \"1d32a312-915b-4dd1-ac17-6b1dbc9065e1\") " pod="openshift-must-gather-jbj8t/must-gather-q2j5w" Apr 17 08:51:06.426722 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.426694 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp7gl\" (UniqueName: \"kubernetes.io/projected/1d32a312-915b-4dd1-ac17-6b1dbc9065e1-kube-api-access-tp7gl\") pod \"must-gather-q2j5w\" (UID: \"1d32a312-915b-4dd1-ac17-6b1dbc9065e1\") " pod="openshift-must-gather-jbj8t/must-gather-q2j5w" Apr 17 08:51:06.498495 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.498442 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jbj8t/must-gather-q2j5w" Apr 17 08:51:06.612936 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:06.612905 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jbj8t/must-gather-q2j5w"] Apr 17 08:51:06.615919 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:51:06.615889 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d32a312_915b_4dd1_ac17_6b1dbc9065e1.slice/crio-3b7b71d2088a49abef60fcc8a3bbcaa815c6201f90328ef686b1e55a19409a61 WatchSource:0}: Error finding container 3b7b71d2088a49abef60fcc8a3bbcaa815c6201f90328ef686b1e55a19409a61: Status 404 returned error can't find the container with id 3b7b71d2088a49abef60fcc8a3bbcaa815c6201f90328ef686b1e55a19409a61 Apr 17 08:51:07.139663 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:07.139631 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jbj8t/must-gather-q2j5w" event={"ID":"1d32a312-915b-4dd1-ac17-6b1dbc9065e1","Type":"ContainerStarted","Data":"3b7b71d2088a49abef60fcc8a3bbcaa815c6201f90328ef686b1e55a19409a61"} Apr 17 08:51:08.145364 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:08.145331 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jbj8t/must-gather-q2j5w" event={"ID":"1d32a312-915b-4dd1-ac17-6b1dbc9065e1","Type":"ContainerStarted","Data":"cfed625dba0de81787d10125364a990ea6493ef9c20254fd505584ac0f449f18"} Apr 17 08:51:08.145364 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:08.145367 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jbj8t/must-gather-q2j5w" event={"ID":"1d32a312-915b-4dd1-ac17-6b1dbc9065e1","Type":"ContainerStarted","Data":"4f54cd8c6ddb3aa8c752df307f42e9a1616c6315e82103f784e9ec28710486b3"} Apr 17 08:51:08.162148 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:08.162080 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-jbj8t/must-gather-q2j5w" podStartSLOduration=1.335025931 podStartE2EDuration="2.162062791s" podCreationTimestamp="2026-04-17 08:51:06 +0000 UTC" firstStartedPulling="2026-04-17 08:51:06.617921014 +0000 UTC m=+3593.961172454" lastFinishedPulling="2026-04-17 08:51:07.444957871 +0000 UTC m=+3594.788209314" observedRunningTime="2026-04-17 08:51:08.160470068 +0000 UTC m=+3595.503721532" watchObservedRunningTime="2026-04-17 08:51:08.162062791 +0000 UTC m=+3595.505314256" Apr 17 08:51:08.977259 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:08.977227 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lm9nr_9ce8efaa-a4ae-457a-b417-1aa180ef551f/global-pull-secret-syncer/0.log" Apr 17 08:51:09.086246 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:09.086216 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-htcxl_b8ce1fa1-5996-43a1-a121-443015650e07/konnectivity-agent/0.log" Apr 17 08:51:09.228446 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:09.228342 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-143.ec2.internal_7929c28430e6e1d330dd3b56bc6070ba/haproxy/0.log" Apr 17 08:51:12.928652 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:12.928624 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nznm9_3bf17a6b-1e32-46a8-b120-2174e7e517b3/node-exporter/0.log" Apr 17 08:51:12.952211 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:12.952186 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nznm9_3bf17a6b-1e32-46a8-b120-2174e7e517b3/kube-rbac-proxy/0.log" Apr 17 08:51:12.973548 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:12.973520 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-nznm9_3bf17a6b-1e32-46a8-b120-2174e7e517b3/init-textfile/0.log" Apr 17 08:51:16.065575 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.065545 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z"] Apr 17 08:51:16.069693 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.069668 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.081085 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.081063 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z"] Apr 17 08:51:16.200670 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.200634 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xq5k\" (UniqueName: \"kubernetes.io/projected/7db7a899-33e6-4b15-81d7-8aff98afd898-kube-api-access-7xq5k\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.201127 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.200916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-sys\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.201127 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.200993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-lib-modules\") pod \"perf-node-gather-daemonset-m566z\" (UID: 
\"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.201127 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.201027 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-podres\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.201127 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.201063 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-proc\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.302162 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.302129 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-sys\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.302312 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.302204 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-lib-modules\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.302312 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.302235 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" 
(UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-podres\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.302312 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.302244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-sys\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.302312 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.302264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-proc\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.302531 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.302312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xq5k\" (UniqueName: \"kubernetes.io/projected/7db7a899-33e6-4b15-81d7-8aff98afd898-kube-api-access-7xq5k\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.302531 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.302340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-proc\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.302531 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.302407 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-podres\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.302531 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.302416 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7db7a899-33e6-4b15-81d7-8aff98afd898-lib-modules\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.310049 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.310024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xq5k\" (UniqueName: \"kubernetes.io/projected/7db7a899-33e6-4b15-81d7-8aff98afd898-kube-api-access-7xq5k\") pod \"perf-node-gather-daemonset-m566z\" (UID: \"7db7a899-33e6-4b15-81d7-8aff98afd898\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.382101 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.382037 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:16.474560 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.474534 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sxlhk_bcb3e19a-a695-43e0-bfdc-31eb223b9c0c/dns/0.log" Apr 17 08:51:16.500597 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.500570 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sxlhk_bcb3e19a-a695-43e0-bfdc-31eb223b9c0c/kube-rbac-proxy/0.log" Apr 17 08:51:16.501185 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.501163 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z"] Apr 17 08:51:16.505405 ip-10-0-138-143 kubenswrapper[2573]: W0417 08:51:16.505360 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7db7a899_33e6_4b15_81d7_8aff98afd898.slice/crio-ed817740639635bfeb19b9a0e4f270acb91a5b010d52291cb4fff9d2cf39d72e WatchSource:0}: Error finding container ed817740639635bfeb19b9a0e4f270acb91a5b010d52291cb4fff9d2cf39d72e: Status 404 returned error can't find the container with id ed817740639635bfeb19b9a0e4f270acb91a5b010d52291cb4fff9d2cf39d72e Apr 17 08:51:16.604372 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:16.604350 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hslwj_75720dfb-fe8b-42c4-b690-1725be056c2e/dns-node-resolver/0.log" Apr 17 08:51:17.042224 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:17.042198 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qd97n_8318f513-f957-48a0-821d-d6718c08e6cb/node-ca/0.log" Apr 17 08:51:17.180803 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:17.180725 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" 
event={"ID":"7db7a899-33e6-4b15-81d7-8aff98afd898","Type":"ContainerStarted","Data":"13c13f110e7d044990f36ab519683f07d9e42f1520fbd57dc9994b2f416d8478"} Apr 17 08:51:17.180803 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:17.180764 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" event={"ID":"7db7a899-33e6-4b15-81d7-8aff98afd898","Type":"ContainerStarted","Data":"ed817740639635bfeb19b9a0e4f270acb91a5b010d52291cb4fff9d2cf39d72e"} Apr 17 08:51:17.180803 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:17.180801 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:17.196920 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:17.196859 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" podStartSLOduration=1.196840312 podStartE2EDuration="1.196840312s" podCreationTimestamp="2026-04-17 08:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:51:17.195816935 +0000 UTC m=+3604.539068422" watchObservedRunningTime="2026-04-17 08:51:17.196840312 +0000 UTC m=+3604.540091767" Apr 17 08:51:18.054711 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:18.054677 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pw7cp_ce7f44b6-f8a9-4abf-b749-3dc63b29e396/serve-healthcheck-canary/0.log" Apr 17 08:51:18.553624 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:18.553595 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q4lq6_d17df228-c496-4bd6-9af7-4ceb036c7530/kube-rbac-proxy/0.log" Apr 17 08:51:18.572487 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:18.572465 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-q4lq6_d17df228-c496-4bd6-9af7-4ceb036c7530/exporter/0.log" Apr 17 08:51:18.591234 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:18.591214 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q4lq6_d17df228-c496-4bd6-9af7-4ceb036c7530/extractor/0.log" Apr 17 08:51:20.487728 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:20.487685 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-558564fd68-rwgw4_b2b6b903-ecdb-4bd5-be31-5c280499ade5/manager/0.log" Apr 17 08:51:20.505414 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:20.505367 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-phzzw_8a25fee9-99e3-4187-b81d-e2f9802f42d2/manager/0.log" Apr 17 08:51:23.196613 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:23.195992 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-m566z" Apr 17 08:51:26.401087 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:26.401058 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tbd6q_8e46b3b0-2ff5-430f-acab-a20e11bb02d0/kube-multus-additional-cni-plugins/0.log" Apr 17 08:51:26.421584 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:26.421561 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tbd6q_8e46b3b0-2ff5-430f-acab-a20e11bb02d0/egress-router-binary-copy/0.log" Apr 17 08:51:26.441891 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:26.441872 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tbd6q_8e46b3b0-2ff5-430f-acab-a20e11bb02d0/cni-plugins/0.log" Apr 17 08:51:26.461115 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:26.461098 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tbd6q_8e46b3b0-2ff5-430f-acab-a20e11bb02d0/bond-cni-plugin/0.log" Apr 17 08:51:26.479834 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:26.479808 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tbd6q_8e46b3b0-2ff5-430f-acab-a20e11bb02d0/routeoverride-cni/0.log" Apr 17 08:51:26.499363 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:26.499347 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tbd6q_8e46b3b0-2ff5-430f-acab-a20e11bb02d0/whereabouts-cni-bincopy/0.log" Apr 17 08:51:26.519264 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:26.519246 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tbd6q_8e46b3b0-2ff5-430f-acab-a20e11bb02d0/whereabouts-cni/0.log" Apr 17 08:51:26.617065 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:26.617013 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vfldz_c0a8fdbe-d345-4229-a451-b516b5f45e25/kube-multus/0.log" Apr 17 08:51:26.685909 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:26.685875 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jvr45_7e501027-4496-4a12-a9d5-fc5c57942102/network-metrics-daemon/0.log" Apr 17 08:51:26.703435 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:26.703411 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jvr45_7e501027-4496-4a12-a9d5-fc5c57942102/kube-rbac-proxy/0.log" Apr 17 08:51:27.891358 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:27.891324 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwtmq_6b3477ff-e316-477b-998e-8681a1f30139/ovn-controller/0.log" Apr 17 08:51:27.922901 ip-10-0-138-143 kubenswrapper[2573]: 
I0417 08:51:27.922837 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwtmq_6b3477ff-e316-477b-998e-8681a1f30139/ovn-acl-logging/0.log" Apr 17 08:51:27.942036 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:27.942015 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwtmq_6b3477ff-e316-477b-998e-8681a1f30139/kube-rbac-proxy-node/0.log" Apr 17 08:51:27.962801 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:27.962779 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwtmq_6b3477ff-e316-477b-998e-8681a1f30139/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 08:51:27.983476 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:27.983454 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwtmq_6b3477ff-e316-477b-998e-8681a1f30139/northd/0.log" Apr 17 08:51:28.006528 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:28.006509 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwtmq_6b3477ff-e316-477b-998e-8681a1f30139/nbdb/0.log" Apr 17 08:51:28.032761 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:28.032737 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwtmq_6b3477ff-e316-477b-998e-8681a1f30139/sbdb/0.log" Apr 17 08:51:28.143240 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:28.143218 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwtmq_6b3477ff-e316-477b-998e-8681a1f30139/ovnkube-controller/0.log" Apr 17 08:51:29.449905 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:29.449876 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-t6snn_129c1cb1-8484-40fe-b434-4354aab1d142/network-check-target-container/0.log" Apr 17 08:51:30.301014 ip-10-0-138-143 
kubenswrapper[2573]: I0417 08:51:30.300967 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tsfwr_8938afcf-2b01-434e-9adb-48a0b9891ff1/iptables-alerter/0.log" Apr 17 08:51:30.951859 ip-10-0-138-143 kubenswrapper[2573]: I0417 08:51:30.951827 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-wsbdc_86ccc290-4522-4b50-9bf3-c06aee8a24d6/tuned/0.log"